The muHVT package is a collection of R functions for vector quantization and the construction of hierarchical Voronoi tessellations, a data visualization tool for inspecting the resulting cells. The hierarchical cells are computed using hierarchical k-means or k-medoids, where a quantization threshold governs the number of levels in the hierarchy for a given parameter \(k\) (the maximum number of cells at each level). The package is particularly helpful for visualizing rich multivariate data.
This package additionally provides functions for computing Sammon’s projection and plotting heat maps of the variables on the tiles of the tessellations.
The muHVT process involves three steps:
This package can perform vector quantization using the following algorithms -
The second and third steps are iterated until a predefined number of iterations is reached or the clusters converge. The runtime for the k-means algorithm is O(n).
The second and third steps are iterated until a predefined number of iterations is reached or the clusters converge. The runtime for the k-medoids algorithm is O(k * (n-k)^2).
The algorithm divides the dataset recursively into cells using the \(k\)-means or \(k\)-medoids algorithm. The maximum number of subsets is determined by setting \(nclust\); for example, setting it to five divides the dataset into at most five subsets. Each of these five subsets is further divided into five subsets (or fewer), resulting in a total of up to twenty-five (5 * 5) subsets. The recursion terminates when a cell either contains fewer than three data points or meets a stop criterion. In this case, the stop criterion is met when the quantization error of a cell exceeds the quantization threshold.
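The recursive splitting described above can be sketched in a few lines of base R. The helper below (`hierarchical_kmeans`) is a simplified, hypothetical illustration using `kmeans()`, not the package's actual implementation:

```r
# Hypothetical sketch: recursively split cells with base-R kmeans() until a
# cell's quantization error (max L2 distance to its centroid) falls below a
# threshold, or the cell has fewer than three points.
hierarchical_kmeans <- function(X, nclust = 5, quant_err = 0.2, depth = 2) {
  if (depth < 1 || nrow(X) < 3) return(NULL)
  km <- kmeans(X, centers = min(nclust, nrow(X) - 1))
  lapply(seq_len(nrow(km$centers)), function(i) {
    pts <- X[km$cluster == i, , drop = FALSE]
    ctr <- matrix(km$centers[i, ], nrow(pts), ncol(X), byrow = TRUE)
    qe  <- max(sqrt(rowSums((pts - ctr)^2)))  # furthest point from centroid
    if (qe > quant_err) {
      hierarchical_kmeans(pts, nclust, quant_err, depth - 1)  # split further
    } else {
      km$centers[i, ]  # centroid is a good enough representative; stop
    }
  })
}

set.seed(42)
X   <- matrix(rnorm(200 * 3), ncol = 3)
res <- hierarchical_kmeans(X, nclust = 5, quant_err = 1.5, depth = 2)
length(res)  # 5 top-level cells
```

Cells whose error exceeds the threshold are split again, giving the hierarchy of levels.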
The steps for this method are as follows:
The stop criterion is met when the quantization error of a cell satisfies one of the conditions below:
Let us try to understand quantization error with an example.
Figure 1: The Voronoi tessellation for level 1, shown for the 5 cells with the data points overlaid
An example of two-dimensional VQ is shown above.
In the above image, we can see 5 cells with each cell containing a certain number of points. The centroid for each cell is shown in blue. These centroids are also known as codewords since they represent all the points in that cell. The set of all codewords is called a codebook.
Now we want to calculate quantization error for each cell. For the
sake of simplicity, let’s consider only one cell having centroid
A and m data points \(F_i\) for calculating quantization
error.
For each point, we calculate the distance between the point and the centroid.
\[ d = ||A - F_i||_{p} \]
In the above equation, p = 1 means L1_Norm distance
whereas p = 2 means L2_Norm distance. In the package, the
L1_Norm distance is chosen by default. The user can pass
either L1_Norm, L2_Norm or a custom function
to calculate the distance between two points in n dimensions.
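For instance, the two built-in distance choices can be computed directly in base R (the point values here are made up for illustration):

```r
# Distance between a centroid A and a data point F_i under both norms.
A  <- c(0, 0, 0)
Fi <- c(1, -2, 2)
l1 <- sum(abs(A - Fi))       # L1_Norm (Manhattan), the package default
l2 <- sqrt(sum((A - Fi)^2))  # L2_Norm (Euclidean)
l1  # 5
l2  # 3
```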
\[QE = \max(||A-F_i||_{p})\]
where
Now, we take the maximum calculated distance of all m points. This
gives us the furthest distance of a point in the cell from the centroid,
which we refer to as Quantization Error. If the
Quantization Error is higher than the given threshold, the
centroid/codevector is not a good representation for the points in the
cell. Now we can perform further Vector Quantization on these points and
repeat the above steps.
Please note that the user can select mean or max to calculate the
Quantization Error. A custom function can also be supplied: it takes a
vector of m values (where each value is the distance between a point in
n dimensions and the centroid) and returns a single value, the
Quantization Error for the cell.
If we select mean as the error metric, the above
Quantization Error equation will look like this :
\[QE = \frac{1}{m}\sum_{i=1}^m||A-F_i||_{p}\]
where
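Both error metrics can be illustrated with a tiny base-R example (the cell below, with centroid A and three points, is made up for illustration):

```r
# Quantization error of one cell: m points (rows of Fm) around centroid A.
A  <- c(0, 0)
Fm <- rbind(c(1, 0), c(0, 2), c(3, 4))
d  <- apply(Fm, 1, function(f) sum(abs(A - f)))  # L1 distance per point
qe_max  <- max(d)   # default metric: furthest point from the centroid
qe_mean <- mean(d)  # alternative metric: average distance
qe_max   # 7
```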
A Voronoi diagram is a way of dividing space into a number of regions. A set of points (called seeds, sites, or generators) is specified beforehand, and for each seed there is a corresponding region consisting of all points closer to that seed than to any other. These regions are called Voronoi cells. The Voronoi diagram is the dual of the Delaunay triangulation.
Sammon’s projection is an algorithm that maps a high-dimensional space to a space of lower dimensionality while attempting to preserve the structure of inter-point distances in the projection. It is particularly suited for use in exploratory data analysis and is usually considered a non-linear approach since the mapping cannot be represented as a linear combination of the original variables. The centroids are plotted in 2D after performing Sammon’s projection at every level of the tessellation.
Denote the distance between the \(i^{th}\) and \(j^{th}\) objects in the original space by \(d_{ij}^*\), and the distance between their projections by \(d_{ij}\). Sammon’s mapping aims to minimize the error function below, which is often referred to as Sammon’s stress or Sammon’s error:
\[E=\frac{1}{\sum_{i<j} d_{ij}^*}\sum_{i<j}\frac{(d_{ij}^*-d_{ij})^2}{d_{ij}^*}\]
The minimization can be performed either by gradient descent, as proposed originally, or by other means, usually involving iterative methods. The number of iterations needs to be determined experimentally, and convergent solutions are not always guaranteed. Many implementations use the first principal components as a starting configuration.
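A minimal example with `MASS::sammon` (the function this package relies on) on simulated data:

```r
# Project a 4-dimensional point cloud into 2D with Sammon's mapping.
library(MASS)
set.seed(1)
X    <- matrix(rnorm(100 * 4), ncol = 4)       # 100 points in 4D
proj <- sammon(dist(X), k = 2, trace = FALSE)  # minimizes Sammon's stress
dim(proj$points)  # 100 x 2 projected coordinates
proj$stress       # the residual Sammon's error E
```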
In this package, we use sammons from the package
MASS to project higher dimensional data to a 2D space. The
function hvq called from the HVT function
returns hierarchical quantized data which will be the input for
construction of the tessellations. The data is then represented in 2D
coordinates and the tessellations are plotted using these coordinates as
centroids. We use the package deldir for this purpose. The
deldir package computes the Delaunay triangulation (and
hence the Dirichlet or Voronoi tessellation) of a planar point set
according to the second (iterative) algorithm of Lee and Schacter. For
subsequent levels, transformation is performed on the 2D coordinates to
get all the points within its parent tile. Tessellations are plotted
using these transformed points as centroids. The lines in the
tessellations are chopped in places so that they do not protrude outside
the parent polygon. This is done for all the subsequent levels.
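As a small illustration of that machinery, deldir can tessellate any planar point set (assuming the deldir package is installed; the points here are random, not actual centroids):

```r
# Compute the Delaunay triangulation / Voronoi tessellation of 20 points.
library(deldir)
set.seed(7)
x  <- runif(20)
y  <- runif(20)
dd <- deldir(x, y)      # Lee-Schacter algorithm, as used by this package
tiles <- tile.list(dd)  # one Voronoi tile (polygon) per input point
length(tiles)  # 20
# plot(tiles) would draw the tessellation
```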
The user can provide an absolute or relative path in the cell below
to access the data from his/her computer. The user can set the
import_data_from_local variable to TRUE to load the dataset
from a local file.
Note: For this notebook, import_data_from_local
has been set to FALSE as we are simulating a dataset in the next
section.
import_data_from_local = FALSE # expects logical input
file_name <- "hotel_data_train.csv" ## Single_hotel_Time_Series.csv,HotelPanel_100.csv
file_path <- "./sample_dataset/"
# Loading the data in the Rstudio environment
# Please change the path in the code line below to the path location of the .csv file
if(import_data_from_local){
file_load <- paste0(file_path, file_name)
dataset_updated <- as.data.frame(fread(file_load))
if(nrow(dataset_updated) > 0){
paste0("File ", file_name, " having ", nrow(dataset_updated), " row(s) and ", ncol(dataset_updated), " column(s)", " imported successfully. ") %>% cat("\n")
# Round only the numeric columns in dataset
dataset_updated <- dataset_updated %>% mutate_if(is.numeric, round, digits = 4)
paste0("Code chunk executed successfully. Below table showing first 10 row(s) of the dataset.") %>% cat("\n")
# Display imported dataset
dataset_updated %>% head(10) %>%
as.data.frame() %>%
DT::datatable(options = options, rownames = TRUE)
}
}
In this section, we will use a simulated dataset. If you are not
using this option, set simulate_dataset to FALSE. Given
below is a simulated dataset that contains 2500 observations and 5
features. In this step, a multivariate normal cross-sectional dataset is
simulated with 5 features using the rnorm() function
(N = 2500, mean = 0, sd = 1), such that every feature has a normal
distribution and any linear combination of its k components also has a
univariate normal distribution.
Here, we load the data and store into a variable
dataset_updated.
simulate_dataset= TRUE
set.seed(257)
dataset_updated <- multiNormalDist(2500,5)
names(dataset_updated) <- paste0("Variable",1:ncol(dataset_updated))
if(nrow(dataset_updated) > 0){
paste0( "Dataset having ", nrow(dataset_updated), " row(s) and ", ncol(dataset_updated), " column(s)", " simulated successfully. ") %>% cat("\n")
# Round only the numeric columns in dataset
dataset_updated <- dataset_updated %>% mutate_if(is.numeric, round, digits = 4)
paste0("Code chunk executed successfully. The table below is showing first 10 row(s) of the dataset.") %>% cat("\n")
# Display imported dataset
dataset_updated %>% head(10) %>%
as.data.frame() %>%
DT::datatable(options = options, rownames = TRUE)
}
Dataset having 2500 row(s) and 5 column(s) simulated successfully.
Code chunk executed successfully. The table below is showing first 10 row(s) of the dataset.
The table below shows a summary of all (numeric & categorical) columns of the dataset.
df <- do.call(cbind, lapply(dataset_updated, summary)) %>%
data.frame() %>%
tibble::rownames_to_column("Metrics")
DT::datatable(df %>%
mutate_if(is.numeric, round,4) %>%
head(),
options = options,
rownames = FALSE)
rm(df)
In the section below, we can see the structure of the data.
dataset_updated %>% str()
'data.frame': 2500 obs. of 5 variables:
$ Variable1: num 0.483 0.881 -0.367 -1.476 1.167 ...
$ Variable2: num 1.004 0.786 0.518 0.331 -2.055 ...
$ Variable3: num -0.587 -0.746 -0.715 -0.238 -0.756 ...
$ Variable4: num 0.647 -1.025 -1.326 -0.197 -0.142 ...
$ Variable5: num 0.67 1.318 0.158 1.058 0.86 ...
The cell below allows the user to drop irrelevant columns.
########################################################################################
################################## User Input Needed ###################################
########################################################################################
# Add column names which you want to remove
want_to_delete_column <- "no"
del_col<-c("hotel_id")
if(want_to_delete_column == "yes"){
dataset_updated <- dataset_updated[ , !(names(dataset_updated) %in% del_col)]
print("Code chunk executed successfully. Overview of data types after removed selected columns")
str( dataset_updated)
}else{
paste0("No Columns removed. Please enter column name if you want to remove that column") %>% cat("\n")
}
No Columns removed. Please enter column name if you want to remove that column
The code below contains a user defined function to rename or reformat any column that the user chooses.
########################################################################################
################################## User Input Needed ###################################
########################################################################################
# convert the column names to lower case
colnames( dataset_updated) <- colnames( dataset_updated) %>% casefold()
## rename column ?
want_to_rename_column <- "no" ## type "yes" if you want to rename a column
## renaming a column of a dataset
rename_col_name <- "fac_id" ## use small letters
rename_col_name_to <- "unique_id"
if(want_to_rename_column == "yes"){
names( dataset_updated)[names( dataset_updated) == rename_col_name] <- rename_col_name_to
}
# remove space, comma, dot from column names
spaceless <- function(x) {colnames(x) <- gsub(pattern = "[^[:alnum:]]+",
replacement = ".",
names(x));x}
dataset_updated <- spaceless( dataset_updated)
## below is the dataset summary
paste0("Successfully converted the column names to lower case. Check the renamed column name if you changed it.") %>% cat("\n")
Successfully converted the column names to lower case. Check the renamed column name if you changed it.
str( dataset_updated) ## showing the structure of the updated dataset
'data.frame': 2500 obs. of 5 variables:
$ variable1: num 0.483 0.881 -0.367 -1.476 1.167 ...
$ variable2: num 1.004 0.786 0.518 0.331 -2.055 ...
$ variable3: num -0.587 -0.746 -0.715 -0.238 -0.756 ...
$ variable4: num 0.647 -1.025 -1.326 -0.197 -0.142 ...
$ variable5: num 0.67 1.318 0.158 1.058 0.86 ...
This section allows the user to change the data type of columns of their choice.
########################################################################################
################################## User Input Needed ###################################
########################################################################################
# If you want to change column type, change a below variable value to "yes"
want_to_change_column_type <- "no"
# you can change column type into numeric or character only
change_column_to_type <- "character" ## numeric
if(want_to_change_column_type == "yes" && change_column_to_type == "character"){
########################################################################################
################################## User Input Needed ###################################
########################################################################################
select_columns <- c("panel_var") ###### Add column names you want to change here #####
dataset_updated[select_columns]<- sapply( dataset_updated[select_columns],as.character)
paste0("Code chunk executed successfully. Datatype of selected column(s) has been changed to character.")
#str( dataset_updated)
}else if(want_to_change_column_type == "yes" && change_column_to_type == "numeric"){
select_columns <- c('gearbox_oil_temperature')
dataset_updated[select_columns]<- sapply( dataset_updated[select_columns],as.numeric)
paste0("Code chunk executed successfully. Datatype of selected column(s) has been changed to numeric.")
#str( dataset_updated)
}else{
paste0("Datatype of columns have not been changed.") %>% cat("\n")
}
Datatype of columns have not been changed.
dataset_updated<-do.call(data.frame, dataset_updated)
str( dataset_updated)
'data.frame': 2500 obs. of 5 variables:
$ variable1: num 0.483 0.881 -0.367 -1.476 1.167 ...
$ variable2: num 1.004 0.786 0.518 0.331 -2.055 ...
$ variable3: num -0.587 -0.746 -0.715 -0.238 -0.756 ...
$ variable4: num 0.647 -1.025 -1.326 -0.197 -0.142 ...
$ variable5: num 0.67 1.318 0.158 1.058 0.86 ...
The presence of duplicate observations can be misleading; this section helps get rid of such rows in the dataset.
want_to_remove_duplicates <- "yes" ## type "no" for choosing to not remove duplicates
## removing duplicate observation if present in the dataset
if(want_to_remove_duplicates == "yes"){
dataset_updated <- dataset_updated %>% unique()
paste0("Code chunk executed successfully, duplicates if present successfully removed. Updated dataset has ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s)") %>% print()
cat("\n")
str( dataset_updated) ## showing summary for updated dataset
} else{
paste0("Code chunk executed successfully, NO duplicates were removed") %>% print()
}
[1] "Code chunk executed successfully, duplicates if present successfully removed. Updated dataset has 2500 row(s) and 5 column(s)"
'data.frame': 2500 obs. of 5 variables:
$ variable1: num 0.483 0.881 -0.367 -1.476 1.167 ...
$ variable2: num 1.004 0.786 0.518 0.331 -2.055 ...
$ variable3: num -0.587 -0.746 -0.715 -0.238 -0.756 ...
$ variable4: num 0.647 -1.025 -1.326 -0.197 -0.142 ...
$ variable5: num 0.67 1.318 0.158 1.058 0.86 ...
# Return the column type
CheckColumnType <- function(dataVector) {
#Check if the column type is "numeric" or "character" & decide type accordingly
if (class(dataVector) == "integer" || class(dataVector) == "numeric") {
columnType <- "numeric"
} else { columnType <- "character" }
#Return the result
return(columnType)
}
### Loading the list of numeric columns in variable
numeric_cols <<- colnames( dataset_updated)[unlist(sapply( dataset_updated,
FUN = function(x){ CheckColumnType(x) == "numeric"}))]
### Loading the list of categorical columns in variable
cat_cols <- colnames( dataset_updated)[unlist(sapply( dataset_updated,
FUN = function(x){
CheckColumnType(x) == "character"|| CheckColumnType(x) == "factor"}))]
### Removing Date Column from the list of categorical column
paste0("Code chunk executed successfully, list of numeric and categorical variables created.") %>% cat()
Code chunk executed successfully, list of numeric and categorical variables created.
paste0("\n\n Numerical Column(s): \n Count : ", length(numeric_cols), "\n") %>% cat()
Numerical Column(s):
Count : 5
paste0(numeric_cols) %>% print()
[1] "variable1" "variable2" "variable3" "variable4" "variable5"
paste0("\n Categorical Column(s): \n Count : ", length(cat_cols), "\n") %>% cat()
Categorical Column(s):
Count : 0
paste0(cat_cols) %>% print()
character(0)
In this section, the dataset can be filtered for required row(s) for further analysis.
want_to_filter_dataset <- "no" ## type "yes" in case you want to filter
filter_col <- "building.type" ## Enter Column name to filter
filter_val <- "3" ## Enter Value to exclude for the column selected
if(want_to_filter_dataset == "yes"){
dataset_updated <- filter_at( dataset_updated
, vars(contains(filter_col))
, all_vars(. != filter_val))
paste0("Code chunk executed successfully, dataset filtered successfully on required columns. Updated dataset has ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s)") %>% print()
cat("\n")
str( dataset_updated) ## showing summary for updated dataset
} else{
paste0("Code chunk executed successfully, entire dataset is available for analysis.") %>% print()
}
[1] "Code chunk executed successfully, entire dataset is available for analysis."
Missing values in the training data can lead to a biased model because we have not correctly analyzed the behavior and relationship of those values with other variables. This can lead to wrong predictions or classifications. Missing values can be of 3 types:
na_total <- sum(is.na( dataset_updated))/prod(dim( dataset_updated))
if(na_total == 0){
paste0("In the uploaded dataset, there is no missing value") %>% cat("\n")
}else{
na_percentage <- paste0(sprintf(na_total*100, fmt = '%#.2f'),"%")
paste0("Percentage of missing value in entire dataset is ",na_percentage) %>% cat("\n")
}
In the uploaded dataset, there is no missing value
The following code visualizes the missing values (if any) using a bar chart.
The gg_miss_upset function is used to visualize the patterns of missingness, or rather the combinations of missingness across cases.
This function gives us (if any missing values are present):
# Below code gives you missing value in each column
paste0("Number of missing values in each column") %>% cat("\n")
Number of missing values in each column
print(sapply( dataset_updated, function(x) sum(is.na(x))))
variable1 variable2 variable3 variable4 variable5
0 0 0 0 0
missing_col_names <- names(which(sapply( dataset_updated, anyNA)))
total_na <- sum(is.na( dataset_updated))
# visualize the missing values (if any) using bar chart
if(total_na > 0 && length(missing_col_names) > 1){
paste0("Code chunk executed successfully. Visualizing the missing values using bar chart") %>% cat("\n")
gg_miss_upset( dataset_updated,
nsets = 10,
nintersects = NA)
}else if(total_na > 0){
dataset_updated %>%
DataExplorer::plot_missing()
# paste0("Code chunk executed successfully. Only one column ",missing_col_names," have missing values ", sum(is.na( dataset_updated)),".") %>% cat("\n")
}else{
paste("Code chunk executed successfully. No missing values exist.") %>% cat("\n")
}
Code chunk executed successfully. No missing values exist.
In this section, the user can decide how to tackle missing values in the dataset. Both column(s) and row(s) can be removed should the user choose to do so.
Note: for missing value imputation, please refer to the data wrangling brick.
The code below accepts user input and deletes the specified column(s).
########################################################################################
################################## User Input Needed ###################################
########################################################################################
# OR do you want to drop column specific column
drop_column_name_na <- "yes" ## type "yes" to drop column(s)
# write column name that you want to drop
drop_column_name <- c("building.type")
if(drop_column_name_na == "yes"){
names_df=names( dataset_updated) %in% drop_column_name
dataset_updated <- dataset_updated[ , which(!names( dataset_updated) %in% drop_column_name)]
paste0("Code chunk executed, selected column(s) dropped successfully.") %>% print()
cat("\n")
str( dataset_updated)
} else {
paste0("Code chunk executed, missing value not removed (if any).") %>% cat("\n")
cat("\n")
}
[1] "Code chunk executed, selected column(s) dropped successfully."
'data.frame': 2500 obs. of 5 variables:
$ variable1: num 0.483 0.881 -0.367 -1.476 1.167 ...
$ variable2: num 1.004 0.786 0.518 0.331 -2.055 ...
$ variable3: num -0.587 -0.746 -0.715 -0.238 -0.756 ...
$ variable4: num 0.647 -1.025 -1.326 -0.197 -0.142 ...
$ variable5: num 0.67 1.318 0.158 1.058 0.86 ...
The code below accepts user input and deletes rows containing missing values.
# Do you want to drop row(s) containing "NA"
drop_row <- "no" ## type "yes" to delete missing value observations
if(drop_row == "yes"){
# imputing blank with NAs and removing all rows containing NAs
# dataset_updated[ dataset_updated == ""] <- NA
# removing missing values from data
dataset_updated <- dataset_updated %>% na.omit()
paste0("Code chunk executed, missing values successfully identified and removed. Updated dataset has ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s)") %>% print()
cat("\n")
# str( dataset_updated)
} else{
paste0("Code chunk executed, missing value(s) not removed (if any).") %>% cat("\n")
cat("\n")
}
Code chunk executed, missing value(s) not removed (if any).
This technique encodes each categorical value as either 1 or 0. It is used for categorical variables with 2 classes. This is done because most models can only handle features that have numeric values.
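A minimal base-R sketch of this encoding (the class column and its values are illustrative, not from this dataset):

```r
# Binary (one-hot) encoding of a 2-class categorical variable.
df <- data.frame(class = factor(c("A", "B", "A", "B")))
df$class_B <- as.integer(df$class == "B")  # 1 if "B", 0 if "A"
df$class_B  # 0 1 0 1
# model.matrix(~ class, df) gives the same indicator column plus an intercept
```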
Shown below is the number of unique values in each categorical column.
cat_cols <-
colnames(dataset_updated)[unlist(sapply(
dataset_updated,
FUN = function(x) {
CheckColumnType(x) == "character" ||
CheckColumnType(x) == "factor"
}
))]
apply(dataset_updated[cat_cols], 2, function(x) {
length(unique(x))
})
integer(0)
Selecting categorical columns with fewer unique values for dummification.
########################################################################################
################################## User Input Needed ###################################
########################################################################################
# Do you want to dummify the categorical variables?
dummify_cat <- FALSE ## TRUE,FALSE
# Select the columns on which dummification is to be performed
dum_cols <- c("location.type","class")
########################################################################################
[1] "One-Hot Encoding was not performed on dataset."
# Check data for singularity
singular_cols <- sapply(dataset_updated,function(x) length(unique(x))) %>% # convert to dataframe
data.frame(Unique_n = .) %>% dplyr::filter(Unique_n == 1) %>%
rownames() %>% data.frame(Constant_Variables = .)
if(nrow(singular_cols) != 0) {
singular_cols %>% DT::datatable()
} else {
paste("There are no singular columns in the dataset") %>% htmltools::HTML()
}
# Display variance of columns
dataset_updated %>% dplyr::summarise_if(is.numeric, var) %>% t() %>%
data.frame() %>% round(3) %>% DT::datatable(colnames = "Variance")
numeric_cols <- as.vector(sapply(dataset_updated, is.numeric))
dataset_updated <- dataset_updated[, numeric_cols]
colnames(dataset_updated)
[1] "variable1" "variable2" "variable3" "variable4" "variable5"
All further operations will be performed on the following dataset.
nums <- colnames(dataset_updated)[unlist(lapply(dataset_updated, is.numeric))]
cat(paste0("Final data frame contains ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s). ", "Code chunk executed. Below table showing first 10 row(s) of the dataset."))
Final data frame contains 2500 row(s) and 5 column(s). Code chunk executed. Below table showing first 10 row(s) of the dataset.
dataset_updated <- dataset_updated %>% mutate_if(is.numeric, round, digits = 4)
dataset_updated %>% head(10) %>%
as.data.frame() %>%
DT::datatable(options = options, rownames = TRUE)
DT::datatable(
dataset_updated %>%
select_if(., is.numeric) %>%
skimr::skim() %>%
mutate_if(is.numeric, round, digits = 4) %>%
rename_at(.vars = vars(starts_with("skim_")), .funs = funs(sub("skim_", "", .))) %>%
rename_at(.vars = vars(starts_with("numeric.")), .funs = funs(sub("numeric.", "", .))) %>%
select(-c(type, n_missing, complete_rate)) %>%
mutate(n_row = nrow(dataset_updated),
n_missing = rowSums(is.na(.))
# ,n_non_missing = n_row - n_missing
) ,
selection = "none",
# filter = "top",
class = 'cell-border stripe',
escape = FALSE,
options = options,
callback = htmlwidgets::JS(
"var tips = ['Index showing column number',
'Columns used for building the HVT model',
'Histogram for individual column',
'Number of records for each feature',
'Number of missing (NA) values for each feature',
'Mean of individual column',
'Standard deviation of individual column',
'0th Percentile means that the values are smaller than all 100% of the rows',
'25th Percentile means that the values are bigger than 25% and smaller than only 75% of the rows',
'50th Percentile means that the values are bigger than 50% and smaller than only 50% of the rows',
'75th Percentile means that the values are bigger than 75% and smaller than only 25% of the rows',
'100th Percentile means that the values are bigger than 100% of the rows'],
header = table.columns().header();
for (var i = 0; i < tips.length; i++) {
$(header[i]).attr('title', tips[i]);
}"
)
)
Shown below is the distribution of all the variables in the dataset.
eda_cols <- names(dataset_updated)
# Here we plot the distribution of columns selected by user for numerical transformation
dist_list <- lapply(1:length(eda_cols), function(i){
generateDistributionPlot(dataset_updated, eda_cols[i]) })
do.call(gridExtra::grid.arrange, args = list(grobs = dist_list, ncol = 2, top = "Distribution of Features"))
In this section, we plot box plots for each numeric column in the dataset. These plots display the median and interquartile range (IQR) of each column.
## the below function helps plotting quantile outlier plot for multiple variables
quantile_outlier_plots_fn <- function(data, outlier_check_var, data_cat = dataset_updated[, cat_cols], numeric_cols = numeric_cols){
# lower threshold
lower_threshold <- stats::quantile(data[, outlier_check_var], .25,na.rm = T) - 1.5*(stats::IQR(data[, outlier_check_var], na.rm = T))
# upper threshold
upper_threshold <- stats::quantile(data[,outlier_check_var],.75,na.rm = T) + 1.5*(stats::IQR(data[,outlier_check_var],na.rm = T))
# Look for outliers based on thresholds
data$QuantileOutlier <- data[,outlier_check_var] > upper_threshold | data[,outlier_check_var] < lower_threshold
# Plot box plot
quantile_outlier_plot <- ggplot2::ggplot(data, ggplot2::aes(x="", y = data[,outlier_check_var])) +
ggplot2::geom_boxplot(fill = 'blue',alpha=0.7) +
ggplot2::theme_bw() +
ggplot2::theme(panel.border=ggplot2::element_rect(size=0.1),panel.grid.minor.x=ggplot2::element_blank(),panel.grid.major.x=ggplot2::element_blank(),legend.position = "bottom") + ggplot2::ylab(outlier_check_var) + ggplot2::xlab("")
data <- cbind(data[, !names(data) %in% c("QuantileOutlier")] %>% round(2), outlier = data[, c("QuantileOutlier")])
data <- cbind(data, data_cat)
return(list(quantile_outlier_plot, data, lower_threshold, upper_threshold))
}
## the below code gives the interactive plot for Quantile Outlier analysis for numerical variables
box_plots <- list()
for (x in names(dataset_updated)) {
box_plots[[x]] <- quantile_outlier_plots_fn(data = dataset_updated, outlier_check_var = x)[[1]]
}
gridExtra::grid.arrange(grobs = box_plots, ncol = 3)
In this section we calculate the Pearson correlation, a bivariate measure of the linear correlation between two numeric columns. The output shown is a matrix.
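A base-R illustration of such a correlation matrix, computed on simulated columns (not this dataset):

```r
# Pearson correlation between numeric columns; cor() returns a matrix.
set.seed(11)
d <- data.frame(a = rnorm(100), b = rnorm(100))
d$c  <- d$a + rnorm(100, sd = 0.1)  # c is strongly correlated with a
cmat <- cor(d, method = "pearson")
round(cmat["a", "c"], 1)  # close to 1; cmat["a", "b"] is close to 0
```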
Let us first split the data into train and test. We will use 80% of the data as train and remaining as test.
## 80% of the sample size
smp_size <- floor(0.80 * nrow(dataset_updated))
## set the seed to make your partition reproducible
set.seed(279)
train_ind <- sample(seq_len(nrow(dataset_updated)), size = smp_size)
dataset_updated_train <- dataset_updated[train_ind, ]
dataset_updated_test <- dataset_updated[-train_ind, ]
The train data contains 2000 rows and 5 columns. The test data contains 500 rows and 5 columns.
eda_cols <- names(dataset_updated_train)
# Here we plot the distribution of columns selected by user for numerical transformation
dist_list <- lapply(1:length(eda_cols), function(i){
generateDistributionPlot(dataset_updated_train, eda_cols[i]) })
do.call(gridExtra::grid.arrange, args = list(grobs = dist_list, ncol = 2, top = "Distribution of Features"))
eda_cols <- names(dataset_updated_test)
# Here we plot the distribution of columns selected by user for numerical transformation
dist_list <- lapply(1:length(eda_cols), function(i){
generateDistributionPlot(dataset_updated_test, eda_cols[i]) })
do.call(gridExtra::grid.arrange, args = list(grobs = dist_list, ncol = 2, top = "Distribution of Features"))
Let us try to understand the HVT function first.
muHVT::HVT(
dataset,
nclust,
depth,
quant.err,
projection.scale,
normalize = T,
distance_metric = c("L1_Norm", "L2_Norm"),
error_metric = c("mean", "max"),
quant_method = c("kmeans", "kmedoids"),
diagnose = TRUE,
hvt_validation = FALSE,
train_validation_split_ratio = 0.8,
...
)
Each of the parameters has been explained below
dataset - A dataframe with numeric
columns
nclust - An integer indicating the
number of cells per hierarchy (level)
depth - An integer indicating the
number of levels. (1 = No hierarchy, 2 = 2 levels, etc …)
quant.err - A number indicating
the quantization error threshold. A cell will only break down into
further cells if the quantization error of the cell is above the defined
quantization error threshold
distance_metric - The distance
metric can be L1_Norm or L2_Norm.
L1_Norm is selected by default. The distance metric is used
to calculate the distance between an n dimensional point
and centroid. The user can also pass a custom function to calculate this
distance
error_metric - The error metric can
be mean or max. max is selected
by default. max will return the max of m
values and mean will take mean of m values
where each value is a distance between a point and centroid of the cell.
Moreover, the user can also pass a custom function to calculate the
error metric
quant_method - The quantization
method can be kmeans or kmedoids.
kmeans is selected by default
normalize - A logical value
indicating whether the columns in your dataset need to be normalized.
Default value is TRUE. The algorithm supports Z-score
normalization
diagnose - A logical value
indicating whether user wants to perform diagnostics on the model.
Default value is TRUE.
hvt_validation - A logical value
indicating whether user wants to holdout a validation set and find mean
absolute deviation of the validation points from the centroid. Default
value is FALSE.
train_validation_split_ratio - A
numeric value indicating train validation split ratio. This argument is
only used when hvt_validation has been set to TRUE. Default value for
the argument is 0.8
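Since normalize defaults to TRUE and the package supports Z-score normalization, the scaling applied to each numeric column can be sketched in base R. This is an illustration of Z-score normalization, not muHVT's internal code:

```r
# Z-score normalization: centre each column on its mean and divide by its
# standard deviation (what normalize = TRUE implies for each column).
set.seed(42)
dataset <- data.frame(a = rnorm(100, mean = 10, sd = 3),
                      b = runif(100, min = 0, max = 50))

normalized <- as.data.frame(scale(dataset))

# Each column now has mean ~0 and standard deviation ~1
colMeans(normalized)
apply(normalized, 2, sd)
```

Normalizing matters because both kmeans and kmedoids are distance-based: without it, a column on a larger scale would dominate the L1/L2 distances.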
More information on building an HVT model at different levels and visualizing the output can be found here
In the section below, we build a Level 1 HVT model. The number of
cells nclust is set to 1035 to achieve 95%
compression.
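To see where 1035 comes from: if compression is taken as the fraction of training points absorbed into cells, i.e. 1 - nclust / n, then 1035 cells correspond to a training set of about 20,700 rows. Both the row count and this definition of compression are assumptions for illustration, not taken from muHVT:

```r
# Hypothetical helper (not part of muHVT): choose nclust for a target
# compression ratio, assuming compression = 1 - nclust / n_rows.
nclust_for_compression <- function(n_rows, compression) {
  round(n_rows * (1 - compression))
}

# With ~20,700 training rows (assumed), 95% compression gives 1035 cells
nclust_for_compression(20700, 0.95)  # -> 1035
```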
set.seed(240)
hvt.results <- list()
hvt.results <- muHVT::HVT(dataset_updated_train,
nclust = 1035,
depth = 1,
quant.err = 0.2,
projection.scale = 10,
normalize = T,
distance_metric = "L1_Norm",
error_metric = "max",
quant_method = "kmeans",
diagnose = TRUE,
hvt_validation = TRUE,
train_validation_split_ratio = 0.8)

# Voronoi tessellation plot for level one
muHVT::plotHVT(hvt.results,
line.width = c(0.6 ),
color.vec = c("#141B41"),
centroid.size = 1,
maxDepth = 1)
The table below shows the number and percentage of cells with quantization error below the threshold.
compressionSummaryTable(hvt.results[[3]]$compression_summary)
As seen in the above table, 95% of the cells have a quantization
error below the threshold. Let's take a closer look at the
Quant.Error variable in the table below. The Quant.Error
values of the cells that have hit the quantization error
threshold are shown in red. Only the top 50 rows are shown
for brevity.
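The Quant.Error column can be reproduced conceptually. With distance_metric = "L1_Norm" and error_metric = "max", the error of a cell is driven by the largest L1 distance between its points and its centroid. The sketch below illustrates that computation on made-up data; it is an assumption for illustration, and muHVT may scale the distance differently internally:

```r
# Sketch: quantization error of one cell under L1_Norm + max error metric.
# Assumption: error = max over the cell's points of the L1 distance
# to the cell centroid (muHVT may apply additional scaling).
cell_points <- matrix(c(0.0, 0.0,
                        1.0, 1.0,
                        0.5, 0.5), ncol = 2, byrow = TRUE)

centroid <- colMeans(cell_points)  # (0.5, 0.5)

l1_distances <- apply(cell_points, 1, function(p) sum(abs(p - centroid)))
quant_error <- max(l1_distances)
quant_error  # -> 1: the largest L1 distance of any point from the centroid
```

This also explains the zeros in the table: a cell containing a single point has its centroid at that point, so its quantization error is 0.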
summaryTable(hvt.results[[3]][['summary']])
| Segment.Level | Segment.Parent | Segment.Child | n | Quant.Error | variable1 | variable2 | variable3 | variable4 | variable5 |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 0 | 0.03 | -0.35 | 0.34 | -1.49 | 0.33 |
| 1 | 1 | 2 | 1 | 0 | 0.33 | 0.19 | -0.14 | 0.47 | -2.52 |
| 1 | 1 | 3 | 2 | 0.1 | 0.83 | -0.49 | -0.33 | 0.27 | 0.62 |
| 1 | 1 | 4 | 1 | 0 | -0.33 | 0.16 | -0.31 | -0.41 | 0.26 |
| 1 | 1 | 5 | 2 | 0.14 | -1.09 | -1.35 | 0.51 | 0.67 | -0.91 |
| 1 | 1 | 6 | 2 | 0.12 | 1.13 | 0.91 | 1.90 | 0.56 | 1.07 |
| 1 | 1 | 7 | 2 | 0.15 | 0.42 | -0.98 | 1.89 | 0.21 | -1.25 |
| 1 | 1 | 8 | 1 | 0 | -0.18 | -1.99 | -0.19 | 1.63 | 0.54 |
| 1 | 1 | 9 | 2 | 0.12 | 0.10 | -0.45 | 1.19 | 1.26 | -1.25 |
| 1 | 1 | 10 | 1 | 0 | 0.11 | 0.00 | -1.17 | -0.43 | -0.33 |
| 1 | 1 | 11 | 2 | 0.11 | -1.51 | 0.45 | 0.34 | -0.48 | -0.71 |
| 1 | 1 | 12 | 1 | 0 | 0.94 | 0.24 | -2.07 | -1.46 | -2.15 |
| 1 | 1 | 13 | 2 | 0.12 | 0.77 | 1.04 | 0.13 | -0.22 | 1.59 |
| 1 | 1 | 14 | 1 | 0 | 1.13 | -0.64 | -0.59 | -1.93 | -0.74 |
| 1 | 1 | 15 | 1 | 0 | 0.43 | 0.51 | 1.95 | -0.26 | 0.58 |
| 1 | 1 | 16 | 1 | 0 | -1.14 | -0.05 | 1.11 | 2.20 | 1.83 |
| 1 | 1 | 17 | 1 | 0 | 1.48 | 0.60 | -0.41 | -0.51 | 2.51 |
| 1 | 1 | 18 | 3 | 0.16 | -1.53 | 0.60 | -0.18 | 0.36 | -0.34 |
| 1 | 1 | 19 | 2 | 0.08 | -0.58 | -0.86 | 0.70 | -1.32 | 0.35 |
| 1 | 1 | 20 | 1 | 0 | 1.96 | 2.61 | -0.32 | -0.79 | 0.77 |
| 1 | 1 | 21 | 1 | 0 | 0.32 | -1.55 | -0.33 | 0.57 | 0.09 |
| 1 | 1 | 22 | 1 | 0 | -0.54 | -0.56 | 0.00 | 0.33 | -0.25 |
| 1 | 1 | 23 | 1 | 0 | -1.37 | -0.63 | -0.33 | -0.96 | 1.86 |
| 1 | 1 | 24 | 1 | 0 | 1.19 | 0.45 | 3.79 | 0.37 | 0.55 |
| 1 | 1 | 25 | 1 | 0 | -0.81 | -1.32 | -0.35 | 0.18 | -1.47 |
| 1 | 1 | 26 | 2 | 0.09 | 0.20 | 1.06 | -1.52 | -0.31 | 0.07 |
| 1 | 1 | 27 | 1 | 0 | -1.03 | 0.75 | 2.30 | 0.69 | 0.65 |
| 1 | 1 | 28 | 1 | 0 | -1.03 | 0.64 | -1.48 | -0.43 | 0.17 |
| 1 | 1 | 29 | 1 | 0 | -1.12 | -2.50 | -0.24 | 1.26 | -0.66 |
| 1 | 1 | 30 | 1 | 0 | -0.96 | 0.04 | 0.06 | -1.80 | 1.50 |
| 1 | 1 | 31 | 1 | 0 | -1.81 | -1.79 | 1.58 | -0.50 | 0.17 |
| 1 | 1 | 32 | 1 | 0 | 0.79 | -1.03 | -0.25 | 1.79 | 1.09 |
| 1 | 1 | 33 | 1 | 0 | 1.62 | -0.52 | 1.39 | 0.58 | 0.00 |
| 1 | 1 | 34 | 3 | 0.14 | 0.27 | -0.23 | 1.45 | 0.81 | -0.02 |
| 1 | 1 | 35 | 3 | 0.2 | 1.13 | 1.26 | -0.52 | 1.27 | -0.97 |
| 1 | 1 | 36 | 1 | 0 | -1.11 | 0.63 | -1.08 | -1.08 | -2.63 |
| 1 | 1 | 37 | 2 | 0.11 | 0.32 | -0.46 | -1.56 | 0.30 | -0.28 |
| 1 | 1 | 38 | 1 | 0 | 0.36 | -0.42 | 0.41 | 1.36 | -1.74 |
| 1 | 1 | 39 | 1 | 0 | -0.72 | -0.22 | -0.09 | -0.80 | 1.71 |
| 1 | 1 | 40 | 2 | 0.1 | -1.27 | -1.42 | -1.26 | 2.48 | -0.30 |
| 1 | 1 | 41 | 1 | 0 | -1.00 | 0.83 | -0.35 | -0.96 | -0.27 |
| 1 | 1 | 42 | 1 | 0 | 0.73 | -0.18 | 0.57 | 0.11 | 0.01 |
| 1 | 1 | 43 | 1 | 0 | 0.40 | 0.61 | -0.76 | -0.82 | -1.98 |
| 1 | 1 | 44 | 3 | 0.12 | -0.20 | -0.67 | 0.40 | 0.79 | -0.59 |
| 1 | 1 | 45 | 2 | 0.12 | 0.83 | 0.53 | -0.18 | 1.55 | -1.40 |
| 1 | 1 | 46 | 2 | 0.15 | -0.96 | 0.42 | -1.67 | 0.36 | -1.55 |
| 1 | 1 | 47 | 1 | 0 | -1.32 | -0.25 | 1.27 | -1.56 | 0.86 |
| 1 | 1 | 48 | 2 | 0.11 | -0.24 | 0.15 | -0.24 | 0.18 | 1.29 |
| 1 | 1 | 49 | 1 | 0 | 0.58 | -0.06 | -0.69 | 1.99 | 1.29 |
| 1 | 1 | 50 | 3 | 0.11 | 0.02 | 0.56 | -0.58 | 0.64 | -0.57 |
| 1 | 1 | 515 | 1 | 0 | 0.15 | 1.33 | -0.53 | 0.19 | 0.55 |
| 1 | 1 | 516 | 1 | 0 | 0.32 | -1.61 | 0.86 | 1.86 | -0.88 |
| 1 | 1 | 517 | 3 | 0.15 | -1.48 | -1.36 | -0.34 | -0.01 | -0.83 |
| 1 | 1 | 518 | 2 | 0.05 | 0.50 | 0.15 | 0.13 | -0.32 | -0.14 |
| 1 | 1 | 519 | 2 | 0.08 | -0.13 | 1.05 | 0.21 | -0.40 | -0.11 |
| 1 | 1 | 520 | 2 | 0.12 | 1.28 | 0.73 | 0.86 | 0.03 | 1.24 |
| 1 | 1 | 521 | 1 | 0 | -0.65 | 1.11 | 1.00 | -0.07 | -1.34 |
| 1 | 1 | 522 | 2 | 0.15 | -0.45 | 1.16 | -1.99 | -0.46 | 0.57 |
| 1 | 1 | 523 | 2 | 0.17 | 0.92 | 1.63 | 1.69 | -0.14 | -0.69 |
| 1 | 1 | 524 | 1 | 0 | -0.48 | 0.25 | 0.00 | -0.06 | -1.04 |
| 1 | 1 | 525 | 1 | 0 | -2.32 | 1.20 | 0.81 | 0.62 | -0.43 |
| 1 | 1 | 526 | 3 | 0.06 | 0.44 | -0.23 | -0.40 | 1.01 | -0.10 |
| 1 | 1 | 527 | 1 | 0 | -0.52 | 0.82 | -0.02 | -1.16 | -1.18 |
| 1 | 1 | 528 | 2 | 0.15 | 0.46 | 0.67 | -0.82 | 1.70 | 0.55 |
| 1 | 1 | 529 | 3 | 0.15 | 0.62 | -0.15 | -0.55 | -0.66 | 0.66 |
| 1 | 1 | 530 | 2 | 0.11 | -0.98 | 0.80 | 1.00 | -0.50 | 0.13 |
| 1 | 1 | 531 | 2 | 0.09 | -1.19 | -1.67 | -1.42 | -0.78 | 1.11 |
| 1 | 1 | 532 | 2 | 0.1 | 0.04 | 2.14 | -0.31 | 0.40 | 0.43 |
| 1 | 1 | 533 | 1 | 0 | 0.37 | -1.61 | -0.90 | 1.32 | 0.01 |
| 1 | 1 | 534 | 2 | 0.13 | -1.54 | 0.28 | -0.60 | -0.21 | 0.39 |
| 1 | 1 | 535 | 1 | 0 | -0.66 | -1.93 | 0.39 | -0.49 | -1.74 |
| 1 | 1 | 536 | 1 | 0 | 0.50 | 1.61 | -2.23 | 0.45 | 2.47 |
| 1 | 1 | 537 | 1 | 0 | -0.90 | -0.44 | -0.87 | 1.08 | 0.46 |
| 1 | 1 | 538 | 2 | 0.1 | -1.08 | 0.92 | -0.47 | -2.26 | -0.34 |
| 1 | 1 | 539 | 2 | 0.19 | -0.60 | -1.42 | -1.83 | -0.35 | 1.49 |
| 1 | 1 | 540 | 1 | 0 | 0.94 | 0.15 | -0.29 | -0.04 | -1.35 |
| 1 | 1 | 541 | 2 | 0.05 | 0.82 | 0.52 | -0.30 | 0.60 | -0.45 |
| 1 | 1 | 542 | 2 | 0.17 | -1.29 | 0.63 | -1.58 | -0.39 | 1.32 |
| 1 | 1 | 543 | 1 | 0 | 1.67 | -1.12 | -0.12 | 0.56 | -3.12 |
| 1 | 1 | 544 | 2 | 0.11 | 0.30 | -0.84 | 0.20 | 0.55 | 1.56 |
| 1 | 1 | 545 | 1 | 0 | 0.67 | -1.69 | -0.80 | 0.19 | -1.31 |
| 1 | 1 | 546 | 2 | 0.11 | 0.69 | 0.13 | 1.38 | -0.88 | 0.07 |
| 1 | 1 | 547 | 3 | 0.18 | 1.13 | -0.24 | -0.41 | -0.22 | 0.04 |
| 1 | 1 | 548 | 1 | 0 | -0.21 | 0.41 | -0.82 | 0.06 | -0.67 |
| 1 | 1 | 549 | 1 | 0 | 0.29 | 0.97 | 1.61 | -1.13 | -0.63 |
| 1 | 1 | 550 | 4 | 0.17 | 1.34 | 0.55 | -0.82 | 0.12 | -0.94 |
| 1 | 1 | 551 | 2 | 0.08 | -0.03 | 0.03 | 0.94 | -0.29 | 0.74 |
| 1 | 1 | 552 | 2 | 0.15 | -1.33 | -0.88 | -0.50 | -0.84 | -1.31 |
| 1 | 1 | 553 | 2 | 0.13 | -1.12 | -1.31 | 0.20 | -0.17 | 1.37 |
| 1 | 1 | 554 | 2 | 0.13 | 0.51 | 0.06 | 1.09 | 0.13 | 1.50 |
| 1 | 1 | 555 | 2 | 0.08 | 0.35 | -0.48 | 2.21 | 1.10 | 0.86 |
| 1 | 1 | 556 | 2 | 0.08 | -0.71 | -0.30 | -1.74 | 0.27 | -0.31 |
| 1 | 1 | 557 | 2 | 0.15 | -0.40 | 1.84 | -0.57 | -0.84 | 1.27 |
| 1 | 1 | 558 | 1 | 0 | -1.34 | 0.38 | 0.18 | -0.35 | -2.08 |
| 1 | 1 | 559 | 2 | 0.14 | -0.59 | -0.10 | -1.72 | -0.25 | 0.94 |
| 1 | 1 | 560 | 1 | 0 | -1.19 | 0.85 | 0.84 | 0.21 | 1.69 |
| 1 | 1 | 561 | 1 | 0 | -0.68 | 0.60 | -0.64 | 0.79 | -0.06 |
| 1 | 1 | 562 | 1 | 0 | -2.24 | 0.62 | -0.56 | -1.49 | -1.22 |
| 1 | 1 | 563 | 3 | 0.11 | -0.50 | 0.25 | 0.45 | -0.75 | -0.33 |
| 1 | 1 | 564 | 1 | 0 | 0.00 | 0.09 | 0.70 | 0.22 | -0.42 |
| 1 | 1 | 565 | 1 | 0 | 0.34 | -1.00 | -2.00 | -0.53 | -0.19 |
| 1 | 1 | 566 | 1 | 0 | -0.56 | 1.58 | -1.46 | 0.48 | -1.44 |
| 1 | 1 | 567 | 1 | 0 | 1.44 | 2.33 | -0.49 | -0.03 | 1.82 |
| 1 | 1 | 568 | 2 | 0.1 | -0.45 | 1.20 | -0.26 | 0.72 | 1.65 |
| 1 | 1 | 569 | 1 | 0 | -0.98 | 2.08 | -0.86 | -0.14 | 0.75 |
| 1 | 1 | 570 | 3 | 0.14 | -0.87 | 0.89 | 0.42 | 0.64 | -0.38 |
| 1 | 1 | 571 | 1 | 0 | -1.77 | 0.74 | -1.66 | -1.13 | -1.46 |
| 1 | 1 | 572 | 1 | 0 | 2.04 | 0.09 | -0.40 | 1.07 | -0.04 |
| 1 | 1 | 573 | 1 | 0 | -0.63 | 0.95 | -2.84 | -1.65 | 0.55 |
| 1 | 1 | 574 | 1 | 0 | -0.60 | -0.86 | 1.38 | 0.39 | -0.61 |
| 1 | 1 | 575 | 1 | 0 | 0.25 | -0.06 | -0.02 | 0.19 | 1.59 |
| 1 | 1 | 576 | 1 | 0 | -0.06 | 0.31 | -1.02 | -0.75 | 0.12 |
| 1 | 1 | 577 | 2 | 0.19 | 1.28 | 0.56 | -2.57 | -0.23 | 0.38 |
| 1 | 1 | 578 | 1 | 0 | -0.84 | 0.41 | -0.97 | -1.56 | 0.47 |
| 1 | 1 | 579 | 2 | 0.16 | 1.39 | 0.23 | 0.28 | -1.63 | -1.06 |
| 1 | 1 | 580 | 3 | 0.13 | 0.19 | 0.27 | -0.73 | 0.93 | 0.08 |
| 1 | 1 | 581 | 2 | 0.11 | 0.87 | 0.15 | 0.38 | -1.07 | 0.33 |
| 1 | 1 | 582 | 2 | 0.11 | -1.10 | 0.37 | 0.05 | -0.05 | -0.23 |
| 1 | 1 | 583 | 1 | 0 | -0.46 | 1.23 | 0.82 | -1.51 | 2.07 |
| 1 | 1 | 584 | 1 | 0 | 0.18 | 1.03 | 0.61 | 0.87 | -0.51 |
| 1 | 1 | 585 | 2 | 0.18 | -0.08 | 0.68 | -1.31 | -0.58 | 2.40 |
| 1 | 1 | 586 | 2 | 0.09 | 0.78 | 0.62 | 1.15 | -0.93 | 0.13 |
| 1 | 1 | 587 | 3 | 0.11 | -1.18 | -0.24 | 0.67 | -0.27 | 0.40 |
| 1 | 1 | 588 | 2 | 0.1 | -0.62 | 1.74 | 1.52 | 0.05 | 0.02 |
| 1 | 1 | 589 | 3 | 0.13 | 0.05 | 0.44 | -1.44 | 0.71 | 0.27 |
| 1 | 1 | 590 | 2 | 0.11 | -0.13 | -0.36 | -0.04 | -1.15 | -0.07 |
| 1 | 1 | 591 | 2 | 0.08 | -0.18 | 0.89 | -0.68 | 0.05 | -0.16 |
| 1 | 1 | 592 | 1 | 0 | 0.32 | -0.76 | -1.52 | 1.15 | 0.71 |
| 1 | 1 | 593 | 1 | 0 | 1.86 | 1.14 | -1.05 | 1.18 | 1.34 |
| 1 | 1 | 594 | 1 | 0 | -1.62 | -0.43 | 0.01 | -0.68 | -0.08 |
| 1 | 1 | 595 | 1 | 0 | -0.15 | 0.18 | -0.94 | 1.27 | -0.21 |
| 1 | 1 | 596 | 1 | 0 | 1.70 | 0.85 | -0.63 | -0.62 | 0.12 |
| 1 | 1 | 597 | 2 | 0.11 | -1.35 | -1.05 | -0.47 | -0.72 | -0.22 |
| 1 | 1 | 598 | 1 | 0 | 2.11 | 2.93 | -0.59 | -1.33 | -2.31 |
| 1 | 1 | 599 | 1 | 0 | -0.23 | -0.20 | 0.84 | -2.19 | 1.90 |
| 1 | 1 | 600 | 1 | 0 | -0.47 | -0.70 | -0.66 | 3.10 | 0.69 |
| 1 | 1 | 601 | 1 | 0 | 0.98 | 0.63 | 0.51 | 1.94 | 0.92 |
| 1 | 1 | 602 | 1 | 0 | -2.64 | 0.01 | -0.09 | 1.47 | 0.10 |
| 1 | 1 | 603 | 1 | 0 | -1.22 | -0.24 | 0.71 | 1.27 | 1.10 |
| 1 | 1 | 604 | 1 | 0 | -2.27 | -1.07 | 2.46 | -2.30 | -0.39 |
| 1 | 1 | 605 | 1 | 0 | 0.60 | -0.19 | -1.22 | -1.67 | -0.74 |
| 1 | 1 | 606 | 2 | 0.1 | 0.51 | 1.20 | -1.18 | -0.06 | -0.46 |
| 1 | 1 | 607 | 2 | 0.11 | -0.91 | -0.42 | -1.26 | -0.18 | 1.79 |
| 1 | 1 | 608 | 1 | 0 | -0.81 | -1.65 | -0.16 | 1.94 | 1.45 |
| 1 | 1 | 609 | 1 | 0 | 0.41 | 1.05 | -1.06 | 2.56 | -0.91 |
| 1 | 1 | 610 | 1 | 0 | -2.56 | -1.26 | 1.11 | 0.43 | 1.80 |
| 1 | 1 | 611 | 1 | 0 | 0.99 | -0.03 | -0.43 | 1.66 | 0.15 |
| 1 | 1 | 612 | 1 | 0 | -0.77 | -0.88 | -0.03 | 0.28 | -0.54 |
| 1 | 1 | 613 | 2 | 0.13 | -0.33 | -1.59 | -0.87 | 1.76 | 0.17 |
| 1 | 1 | 614 | 2 | 0.12 | -0.64 | -0.28 | -0.95 | 1.16 | -1.30 |
| 1 | 1 | 615 | 1 | 0 | -0.13 | -2.10 | 1.66 | -1.13 | 0.28 |
| 1 | 1 | 616 | 1 | 0 | 0.32 | 1.01 | -0.15 | 1.25 | 1.16 |
| 1 | 1 | 617 | 2 | 0.13 | 0.90 | -0.46 | -2.06 | 0.22 | 1.89 |
| 1 | 1 | 618 | 3 | 0.09 | 0.02 | 0.33 | -0.10 | -0.29 | -0.67 |
| 1 | 1 | 619 | 2 | 0.08 | -0.13 | 0.02 | -0.47 | 0.20 | 0.48 |
| 1 | 1 | 620 | 1 | 0 | 0.37 | 0.99 | -0.23 | -0.70 | 2.22 |
| 1 | 1 | 621 | 1 | 0 | -1.22 | 1.40 | -0.48 | -1.48 | 0.32 |
| 1 | 1 | 622 | 3 | 0.14 | -0.09 | 1.82 | -1.16 | -1.69 | 0.49 |
| 1 | 1 | 623 | 1 | 0 | 0.37 | -0.66 | 0.24 | 0.55 | 0.35 |
| 1 | 1 | 624 | 2 | 0.12 | -0.99 | 0.61 | 0.75 | 0.64 | 0.41 |
| 1 | 1 | 625 | 2 | 0.12 | 1.01 | -0.66 | -1.75 | 0.44 | -0.11 |
| 1 | 1 | 626 | 2 | 0.18 | -0.49 | -2.63 | 0.11 | 0.18 | -0.44 |
| 1 | 1 | 627 | 1 | 0 | 0.07 | -0.43 | 0.05 | -0.19 | 0.03 |
| 1 | 1 | 628 | 1 | 0 | -0.56 | 0.26 | 1.11 | -0.96 | 1.59 |
| 1 | 1 | 629 | 1 | 0 | -0.91 | 0.60 | 0.85 | -0.09 | 0.76 |
| 1 | 1 | 630 | 1 | 0 | -0.49 | 1.63 | 2.52 | 0.49 | -1.13 |
| 1 | 1 | 631 | 2 | 0.1 | -0.12 | -0.78 | 0.24 | -0.57 | 0.38 |
| 1 | 1 | 632 | 1 | 0 | -3.60 | -0.11 | -0.98 | 0.86 | -1.44 |
| 1 | 1 | 633 | 1 | 0 | -0.81 | -0.46 | -0.21 | -1.04 | -2.36 |
| 1 | 1 | 634 | 1 | 0 | -1.90 | 0.92 | 1.29 | 1.27 | 0.07 |
| 1 | 1 | 635 | 3 | 0.13 | 0.27 | 0.49 | 0.89 | -0.40 | -0.09 |
| 1 | 1 | 636 | 2 | 0.15 | -2.21 | -0.31 | 0.52 | -0.26 | -0.47 |
| 1 | 1 | 637 | 1 | 0 | -2.59 | -1.36 | 1.24 | -0.22 | -0.29 |
| 1 | 1 | 638 | 1 | 0 | 0.70 | 1.30 | 0.75 | 0.98 | 2.45 |
| 1 | 1 | 639 | 1 | 0 | 0.66 | -1.21 | 1.81 | -0.39 | 0.03 |
| 1 | 1 | 640 | 1 | 0 | 1.09 | -1.73 | 0.95 | -2.03 | 0.56 |
| 1 | 1 | 641 | 2 | 0.12 | 1.14 | 0.80 | 0.23 | 1.05 | -0.95 |
| 1 | 1 | 642 | 1 | 0 | 0.99 | -0.78 | -1.56 | -1.00 | -1.25 |
| 1 | 1 | 643 | 1 | 0 | 0.41 | 0.59 | -1.46 | -1.84 | 1.24 |
| 1 | 1 | 644 | 2 | 0.15 | 0.01 | -0.68 | -0.95 | -1.63 | -1.13 |
| 1 | 1 | 645 | 2 | 0.07 | -1.60 | 0.77 | 0.14 | -0.53 | 0.13 |
| 1 | 1 | 646 | 2 | 0.06 | -0.56 | 0.38 | -0.49 | 0.38 | -0.07 |
| 1 | 1 | 647 | 2 | 0.11 | -1.08 | -1.32 | 0.68 | 0.33 | 0.43 |
| 1 | 1 | 648 | 1 | 0 | -0.99 | 0.10 | -0.27 | -0.60 | -0.54 |
| 1 | 1 | 649 | 1 | 0 | 0.90 | -0.06 | -0.67 | 0.87 | 0.84 |
| 1 | 1 | 650 | 3 | 0.2 | 1.40 | -0.93 | -1.74 | -0.29 | 0.35 |
| 1 | 1 | 651 | 1 | 0 | 0.85 | -0.88 | 0.94 | -0.80 | 1.07 |
| 1 | 1 | 652 | 1 | 0 | -0.49 | -0.45 | -0.08 | -1.75 | -0.01 |
| 1 | 1 | 653 | 1 | 0 | 2.60 | -0.33 | 0.17 | -2.06 | -1.47 |
| 1 | 1 | 654 | 1 | 0 | 1.85 | 1.64 | -1.35 | -1.44 | 1.80 |
| 1 | 1 | 655 | 1 | 0 | 0.31 | 0.86 | 0.35 | 0.41 | 0.99 |
| 1 | 1 | 656 | 1 | 0 | -1.27 | -0.04 | 0.84 | -0.11 | 2.14 |
| 1 | 1 | 657 | 1 | 0 | 0.24 | 0.82 | 1.38 | -1.08 | -0.56 |
| 1 | 1 | 658 | 1 | 0 | -1.00 | 1.45 | 0.00 | 1.51 | -0.12 |
| 1 | 1 | 659 | 1 | 0 | -0.23 | 0.88 | 0.81 | 0.41 | 1.74 |
| 1 | 1 | 660 | 2 | 0.07 | 0.62 | -1.05 | 1.06 | -0.42 | 0.61 |
| 1 | 1 | 661 | 1 | 0 | 0.44 | -0.29 | -0.06 | 0.64 | -0.13 |
| 1 | 1 | 662 | 2 | 0.09 | 1.05 | -0.37 | 0.15 | 0.39 | -0.64 |
| 1 | 1 | 663 | 2 | 0.12 | 0.41 | 0.55 | 0.15 | 0.85 | 0.02 |
| 1 | 1 | 664 | 1 | 0 | 0.33 | 0.01 | -0.44 | 0.63 | 0.03 |
| 1 | 1 | 665 | 3 | 0.23 | 0.77 | -0.03 | 0.38 | -0.20 | 2.77 |
| 1 | 1 | 666 | 1 | 0 | -0.67 | -0.40 | 0.23 | -0.84 | 0.61 |
| 1 | 1 | 667 | 2 | 0.14 | -0.35 | 0.94 | -2.06 | 0.58 | -0.71 |
| 1 | 1 | 668 | 2 | 0.12 | -2.15 | -0.72 | -0.33 | -2.08 | 0.04 |
| 1 | 1 | 669 | 2 | 0.12 | -0.09 | 0.05 | 0.35 | -0.75 | -1.41 |
| 1 | 1 | 670 | 1 | 0 | -2.05 | -1.55 | -2.23 | 0.21 | -1.46 |
| 1 | 1 | 671 | 2 | 0.16 | 1.48 | 1.27 | -0.33 | 1.89 | 0.67 |
| 1 | 1 | 672 | 3 | 0.2 | 0.09 | -1.55 | -1.03 | -2.01 | -0.11 |
| 1 | 1 | 673 | 1 | 0 | -0.07 | 0.67 | 0.33 | -1.10 | -0.31 |
| 1 | 1 | 674 | 1 | 0 | -0.25 | 0.54 | -0.78 | 1.99 | -0.84 |
| 1 | 1 | 675 | 1 | 0 | 1.01 | 0.07 | 1.13 | -2.17 | 0.77 |
| 1 | 1 | 676 | 1 | 0 | -0.84 | -0.17 | -0.72 | 0.22 | -0.50 |
| 1 | 1 | 677 | 1 | 0 | 0.30 | -0.56 | -0.98 | 0.20 | -1.17 |
| 1 | 1 | 678 | 1 | 0 | 0.59 | -0.03 | 0.14 | -1.77 | -0.08 |
| 1 | 1 | 679 | 2 | 0.13 | -0.29 | -0.26 | -0.69 | -1.17 | 0.69 |
| 1 | 1 | 680 | 2 | 0.13 | 1.02 | -0.75 | 0.85 | 0.58 | 2.28 |
| 1 | 1 | 681 | 2 | 0.11 | 0.29 | 1.68 | -1.10 | 1.02 | 0.58 |
| 1 | 1 | 682 | 1 | 0 | -1.23 | 1.17 | 0.07 | -2.20 | 1.42 |
| 1 | 1 | 683 | 1 | 0 | 0.24 | -0.96 | -0.18 | -0.09 | 1.90 |
| 1 | 1 | 684 | 1 | 0 | -1.01 | -0.32 | 1.82 | -0.60 | -2.10 |
| 1 | 1 | 685 | 2 | 0.22 | 1.60 | 0.90 | 0.54 | -1.82 | 1.67 |
| 1 | 1 | 686 | 1 | 0 | 0.11 | 2.25 | -0.48 | -1.37 | 2.22 |
| 1 | 1 | 687 | 3 | 0.12 | 1.00 | 0.02 | 0.81 | -0.62 | 0.10 |
| 1 | 1 | 688 | 1 | 0 | -0.40 | 0.45 | -0.87 | 1.11 | 0.12 |
| 1 | 1 | 689 | 1 | 0 | 0.30 | -1.24 | 2.81 | 0.80 | -0.50 |
| 1 | 1 | 690 | 3 | 0.12 | -0.87 | 0.51 | -0.31 | -1.34 | 0.76 |
| 1 | 1 | 691 | 2 | 0.09 | -0.29 | 0.49 | 0.23 | 1.42 | 0.08 |
| 1 | 1 | 692 | 2 | 0.06 | 0.25 | 1.06 | -0.44 | -0.43 | 0.88 |
| 1 | 1 | 693 | 2 | 0.09 | -0.85 | -0.29 | 1.38 | 1.64 | -0.85 |
| 1 | 1 | 694 | 3 | 0.1 | 0.49 | -0.21 | 0.80 | -0.30 | -0.75 |
| 1 | 1 | 695 | 2 | 0.14 | -0.35 | 1.78 | -0.30 | -0.13 | -1.22 |
| 1 | 1 | 696 | 2 | 0.09 | -0.23 | 0.35 | 1.50 | -0.79 | 0.62 |
| 1 | 1 | 697 | 1 | 0 | 1.29 | -1.36 | 0.50 | 0.14 | 0.17 |
| 1 | 1 | 698 | 1 | 0 | -0.42 | 0.58 | -0.95 | -0.97 | 1.44 |
| 1 | 1 | 699 | 2 | 0.14 | -0.62 | -1.17 | -0.03 | 0.32 | 0.76 |
| 1 | 1 | 700 | 2 | 0.1 | 0.21 | 0.07 | 0.81 | -1.31 | 0.14 |
| 1 | 1 | 701 | 1 | 0 | -0.16 | -1.57 | -0.17 | -0.81 | 0.37 |
| 1 | 1 | 702 | 1 | 0 | -0.74 | -1.33 | 0.66 | 0.86 | 1.12 |
| 1 | 1 | 703 | 1 | 0 | 1.95 | -1.61 | 1.28 | 2.08 | 0.44 |
| 1 | 1 | 704 | 3 | 0.14 | 0.76 | -1.22 | 0.47 | -0.20 | -0.72 |
| 1 | 1 | 705 | 2 | 0.09 | -1.69 | -0.08 | -0.63 | 1.26 | -0.21 |
| 1 | 1 | 706 | 1 | 0 | 0.75 | 0.51 | 1.64 | -1.23 | -1.17 |
| 1 | 1 | 707 | 1 | 0 | -0.07 | 1.90 | 0.75 | 1.57 | -1.95 |
| 1 | 1 | 708 | 2 | 0.06 | -0.28 | 0.41 | -0.46 | 1.61 | 0.84 |
| 1 | 1 | 709 | 2 | 0.2 | -1.42 | -0.69 | 1.30 | -1.65 | -0.25 |
| 1 | 1 | 710 | 1 | 0 | -0.92 | -1.17 | 0.19 | -1.43 | -0.59 |
| 1 | 1 | 711 | 1 | 0 | 1.91 | -0.23 | -0.62 | 0.14 | -0.14 |
| 1 | 1 | 712 | 2 | 0.16 | -1.74 | 0.09 | 1.21 | -1.96 | -0.08 |
| 1 | 1 | 713 | 1 | 0 | 0.22 | 2.25 | -0.79 | 0.18 | -1.23 |
| 1 | 1 | 714 | 2 | 0.09 | 0.05 | -0.04 | 1.37 | -0.23 | -0.97 |
| 1 | 1 | 715 | 2 | 0.15 | -1.75 | 0.73 | -1.65 | 1.08 | 0.20 |
| 1 | 1 | 716 | 2 | 0.12 | 0.82 | 1.07 | 1.76 | 1.63 | -1.46 |
| 1 | 1 | 717 | 4 | 0.2 | 0.45 | 1.69 | -0.29 | 0.62 | -1.09 |
| 1 | 1 | 718 | 3 | 0.17 | 0.35 | 1.72 | -0.19 | -0.57 | 0.45 |
| 1 | 1 | 719 | 3 | 0.13 | -0.05 | 0.18 | 0.99 | -0.84 | -1.42 |
| 1 | 1 | 720 | 1 | 0 | 1.02 | 1.88 | 1.25 | 0.37 | -1.45 |
| 1 | 1 | 721 | 1 | 0 | 0.46 | -1.55 | 0.43 | 0.07 | 2.25 |
| 1 | 1 | 722 | 1 | 0 | -1.22 | -1.02 | -0.14 | 1.78 | 0.53 |
| 1 | 1 | 723 | 2 | 0.25 | 0.19 | -0.42 | 1.42 | 0.70 | -2.66 |
| 1 | 1 | 724 | 1 | 0 | 0.04 | -1.18 | 0.24 | 1.65 | 0.70 |
| 1 | 1 | 725 | 1 | 0 | 1.26 | 1.54 | -1.37 | -0.58 | 0.90 |
| 1 | 1 | 726 | 1 | 0 | -0.71 | 1.30 | -1.22 | 0.04 | -0.13 |
| 1 | 1 | 727 | 1 | 0 | -0.34 | 0.24 | -0.61 | -2.21 | 1.16 |
| 1 | 1 | 728 | 1 | 0 | -0.53 | 1.01 | -0.69 | -0.77 | 0.85 |
| 1 | 1 | 729 | 1 | 0 | 1.69 | 0.50 | -1.73 | -0.38 | 1.84 |
| 1 | 1 | 730 | 1 | 0 | 2.94 | 0.64 | 0.53 | 0.65 | -1.03 |
| 1 | 1 | 731 | 3 | 0.18 | 1.33 | 0.25 | -0.31 | 0.54 | 0.69 |
| 1 | 1 | 732 | 1 | 0 | -2.06 | -0.99 | 0.87 | 0.07 | 0.79 |
| 1 | 1 | 733 | 2 | 0.1 | 0.13 | -0.03 | -0.88 | 0.05 | -0.55 |
| 1 | 1 | 734 | 1 | 0 | -0.18 | -1.68 | 0.01 | 0.73 | 0.13 |
| 1 | 1 | 735 | 2 | 0.09 | -1.17 | -0.92 | -0.35 | 1.66 | -0.45 |
| 1 | 1 | 736 | 2 | 0.18 | 2.01 | -1.03 | -0.87 | 0.13 | -0.57 |
| 1 | 1 | 737 | 1 | 0 | 0.05 | -0.47 | -0.13 | 1.82 | -0.09 |
| 1 | 1 | 738 | 2 | 0.08 | 0.05 | -0.10 | 1.49 | 0.26 | -1.34 |
| 1 | 1 | 739 | 1 | 0 | -1.84 | -0.05 | 2.59 | 0.80 | 2.90 |
| 1 | 1 | 740 | 2 | 0.12 | 0.06 | 0.45 | 1.16 | -0.82 | -1.98 |
| 1 | 1 | 741 | 1 | 0 | -0.10 | 2.30 | -2.13 | -1.06 | -0.67 |
| 1 | 1 | 742 | 1 | 0 | -0.23 | -0.44 | 0.33 | -2.79 | 1.05 |
| 1 | 1 | 743 | 2 | 0.08 | -0.70 | 0.40 | -0.30 | -0.55 | 0.91 |
| 1 | 1 | 744 | 1 | 0 | -1.30 | -0.48 | -0.12 | -0.83 | 0.68 |
| 1 | 1 | 745 | 2 | 0.09 | 0.12 | -0.44 | -0.75 | -0.84 | -0.10 |
| 1 | 1 | 746 | 1 | 0 | 0.13 | -0.84 | 0.09 | -0.25 | -0.39 |
| 1 | 1 | 747 | 1 | 0 | 2.94 | 1.71 | -0.29 | 0.72 | 1.70 |
| 1 | 1 | 748 | 1 | 0 | -0.26 | -0.98 | -0.56 | -2.20 | -0.69 |
| 1 | 1 | 749 | 2 | 0.1 | -0.22 | -0.83 | 0.43 | -0.87 | -1.68 |
| 1 | 1 | 750 | 1 | 0 | -0.22 | 0.52 | 1.52 | 0.50 | 0.13 |
| 1 | 1 | 751 | 1 | 0 | 1.80 | -1.99 | 1.39 | -1.57 | 1.65 |
| 1 | 1 | 752 | 2 | 0.1 | -0.84 | -2.09 | 0.50 | 0.91 | 0.13 |
| 1 | 1 | 753 | 1 | 0 | -0.30 | 0.00 | -1.40 | -2.48 | -0.78 |
| 1 | 1 | 754 | 1 | 0 | -1.73 | 0.93 | -0.45 | 0.59 | -1.04 |
| 1 | 1 | 755 | 2 | 0.11 | 0.68 | -0.49 | -0.22 | 0.52 | -1.23 |
| 1 | 1 | 756 | 2 | 0.11 | 0.80 | 0.39 | -0.72 | -1.33 | -0.73 |
| 1 | 1 | 757 | 1 | 0 | 0.63 | 0.55 | 1.37 | -1.42 | -1.03 |
| 1 | 1 | 758 | 1 | 0 | 0.25 | 0.45 | -0.55 | -0.15 | -0.28 |
| 1 | 1 | 759 | 2 | 0.1 | 0.21 | -1.10 | -0.57 | -0.69 | 0.00 |
| 1 | 1 | 760 | 3 | 0.13 | 0.56 | -1.24 | 0.38 | 0.83 | 0.51 |
| 1 | 1 | 761 | 2 | 0.1 | -0.16 | 0.15 | 0.67 | 0.65 | 1.11 |
| 1 | 1 | 762 | 1 | 0 | 0.60 | -1.91 | 0.77 | -0.33 | 1.11 |
| 1 | 1 | 763 | 2 | 0.09 | -1.66 | -0.46 | 0.15 | 0.77 | 1.51 |
| 1 | 1 | 764 | 1 | 0 | -0.21 | 1.29 | 0.23 | -0.02 | -1.10 |
| 1 | 1 | 765 | 1 | 0 | 1.81 | -0.16 | 1.08 | -1.77 | -1.20 |
| 1 | 1 | 766 | 1 | 0 | -2.97 | 1.00 | 0.88 | -0.98 | -0.51 |
| 1 | 1 | 767 | 2 | 0.13 | -0.72 | -0.57 | -0.02 | 1.40 | -1.13 |
| 1 | 1 | 768 | 2 | 0.12 | 0.51 | 0.80 | 0.17 | -1.90 | -0.18 |
| 1 | 1 | 769 | 2 | 0.1 | 0.59 | 0.20 | 0.48 | 0.18 | 1.41 |
| 1 | 1 | 770 | 1 | 0 | 0.90 | -0.70 | 0.10 | 0.71 | -1.00 |
| 1 | 1 | 771 | 1 | 0 | 0.77 | -0.34 | 1.10 | -0.65 | 1.58 |
| 1 | 1 | 772 | 1 | 0 | -0.72 | 0.31 | -2.23 | 1.33 | 0.00 |
| 1 | 1 | 773 | 1 | 0 | 0.71 | 1.41 | -0.48 | -1.31 | -0.03 |
| 1 | 1 | 774 | 1 | 0 | -1.63 | -0.28 | -0.14 | 0.06 | 0.01 |
| 1 | 1 | 775 | 2 | 0.15 | -1.24 | 0.19 | 0.81 | 0.01 | -0.64 |
| 1 | 1 | 776 | 1 | 0 | -2.34 | -1.26 | 0.94 | -0.53 | -0.91 |
| 1 | 1 | 777 | 2 | 0.14 | 0.96 | 0.10 | -1.17 | -1.75 | -0.01 |
| 1 | 1 | 778 | 2 | 0.18 | 1.99 | 1.61 | 1.90 | -0.65 | 0.23 |
| 1 | 1 | 779 | 1 | 0 | 2.94 | 1.19 | 0.56 | 0.15 | 0.82 |
| 1 | 1 | 780 | 1 | 0 | 1.56 | 0.21 | -0.30 | 1.96 | -0.92 |
| 1 | 1 | 781 | 1 | 0 | 0.03 | -0.38 | -0.27 | -2.47 | 0.35 |
| 1 | 1 | 782 | 1 | 0 | -2.46 | -0.30 | 0.70 | -1.30 | 0.62 |
| 1 | 1 | 783 | 1 | 0 | 0.13 | -2.53 | 1.18 | 0.68 | 2.07 |
| 1 | 1 | 784 | 2 | 0.11 | -0.37 | 0.36 | 0.42 | 0.90 | -0.06 |
| 1 | 1 | 785 | 1 | 0 | 1.85 | -0.43 | -0.34 | -1.41 | 1.49 |
| 1 | 1 | 786 | 1 | 0 | 0.28 | 1.26 | 1.39 | 0.79 | 1.72 |
| 1 | 1 | 787 | 1 | 0 | -0.33 | -1.76 | 1.06 | -1.64 | -0.83 |
| 1 | 1 | 788 | 2 | 0.08 | 0.31 | -0.95 | -0.52 | 1.13 | -0.08 |
| 1 | 1 | 789 | 2 | 0.21 | 1.53 | -1.21 | -1.21 | 0.88 | -1.19 |
| 1 | 1 | 790 | 1 | 0 | -0.98 | 0.57 | 1.46 | -1.11 | 0.72 |
| 1 | 1 | 791 | 3 | 0.18 | -0.13 | -0.13 | -1.47 | 1.53 | -1.07 |
| 1 | 1 | 792 | 2 | 0.11 | 0.15 | -0.87 | -1.05 | 2.31 | -0.16 |
| 1 | 1 | 793 | 1 | 0 | 0.88 | -0.22 | -1.67 | -0.46 | -0.14 |
| 1 | 1 | 794 | 1 | 0 | -0.79 | 0.80 | -1.10 | 0.54 | 0.21 |
| 1 | 1 | 795 | 1 | 0 | 0.17 | -0.36 | -1.32 | 1.15 | 1.24 |
| 1 | 1 | 796 | 2 | 0.16 | -1.57 | 1.02 | 2.00 | 0.87 | -0.67 |
| 1 | 1 | 797 | 2 | 0.08 | 0.10 | 0.67 | 0.82 | 0.89 | 0.21 |
| 1 | 1 | 798 | 2 | 0.08 | -0.51 | 0.18 | 0.87 | 0.80 | -0.86 |
| 1 | 1 | 799 | 2 | 0.06 | 0.32 | -0.37 | -0.15 | -0.59 | -0.16 |
| 1 | 1 | 800 | 2 | 0.1 | -0.90 | 0.66 | -1.29 | -0.14 | 0.85 |
| 1 | 1 | 801 | 2 | 0.07 | 0.79 | -1.09 | 0.33 | -0.59 | 0.25 |
| 1 | 1 | 802 | 1 | 0 | 0.55 | -0.53 | 1.99 | 1.08 | 0.15 |
| 1 | 1 | 803 | 2 | 0.08 | -0.37 | 1.03 | 1.21 | 1.79 | 0.04 |
| 1 | 1 | 804 | 2 | 0.13 | 0.45 | 0.58 | 0.06 | 0.49 | -1.68 |
| 1 | 1 | 805 | 3 | 0.11 | -0.35 | -0.48 | 0.04 | 0.71 | -0.93 |
| 1 | 1 | 806 | 4 | 0.15 | -0.84 | 0.01 | -0.92 | -0.68 | -0.25 |
| 1 | 1 | 807 | 1 | 0 | -0.88 | -0.28 | 0.11 | 0.26 | -2.64 |
| 1 | 1 | 808 | 1 | 0 | 1.95 | -0.28 | -1.34 | -1.71 | -0.43 |
| 1 | 1 | 809 | 1 | 0 | 0.22 | 0.13 | 1.21 | 0.25 | 0.34 |
| 1 | 1 | 810 | 1 | 0 | 1.07 | -0.31 | -0.28 | 0.71 | -0.31 |
| 1 | 1 | 811 | 2 | 0.07 | -0.38 | -1.01 | -1.64 | 0.98 | 0.05 |
| 1 | 1 | 812 | 1 | 0 | -1.47 | 2.02 | -0.48 | -1.81 | -1.59 |
| 1 | 1 | 813 | 1 | 0 | 1.96 | 1.28 | -1.34 | -0.55 | -1.54 |
| 1 | 1 | 814 | 1 | 0 | -1.00 | -0.41 | 0.79 | 0.73 | -1.31 |
| 1 | 1 | 815 | 1 | 0 | -0.98 | -0.80 | 0.51 | 0.25 | 1.35 |
| 1 | 1 | 816 | 1 | 0 | -0.60 | 2.34 | 0.37 | -0.27 | 0.18 |
| 1 | 1 | 817 | 1 | 0 | 0.36 | 0.94 | 0.00 | -0.04 | 0.98 |
| 1 | 1 | 818 | 2 | 0.11 | -0.13 | 0.01 | -1.47 | 0.77 | 0.82 |
| 1 | 1 | 819 | 1 | 0 | 0.66 | -0.34 | -0.97 | 0.79 | 0.10 |
| 1 | 1 | 820 | 2 | 0.12 | -0.39 | 0.41 | 1.50 | -0.37 | -0.33 |
| 1 | 1 | 821 | 1 | 0 | 0.10 | 0.04 | -1.50 | 0.92 | 1.95 |
| 1 | 1 | 822 | 1 | 0 | 0.22 | 1.31 | 0.45 | -1.00 | -1.41 |
| 1 | 1 | 823 | 1 | 0 | 0.54 | 1.84 | 1.23 | 1.12 | 0.46 |
| 1 | 1 | 824 | 1 | 0 | -2.17 | 2.39 | -0.48 | 0.75 | 0.13 |
| 1 | 1 | 825 | 2 | 0.18 | -0.25 | -0.54 | -1.08 | -1.73 | 0.44 |
| 1 | 1 | 826 | 1 | 0 | -0.78 | 1.31 | 1.10 | 0.84 | -1.05 |
| 1 | 1 | 827 | 2 | 0.15 | -1.10 | 0.22 | 1.67 | 1.45 | 0.71 |
| 1 | 1 | 828 | 1 | 0 | 0.94 | -1.76 | 0.46 | -1.34 | -0.31 |
| 1 | 1 | 829 | 1 | 0 | -0.52 | 1.93 | -0.24 | 2.31 | -0.70 |
| 1 | 1 | 830 | 2 | 0.1 | -0.14 | -0.77 | -0.10 | -0.03 | 0.74 |
| 1 | 1 | 831 | 2 | 0.11 | -0.59 | 1.56 | -0.02 | -1.27 | 0.34 |
| 1 | 1 | 832 | 1 | 0 | 1.17 | -2.20 | 1.92 | 0.63 | 0.03 |
| 1 | 1 | 833 | 1 | 0 | 0.49 | -1.12 | 0.38 | -1.23 | -0.03 |
| 1 | 1 | 834 | 1 | 0 | -0.05 | -0.08 | 1.84 | 0.38 | 0.28 |
| 1 | 1 | 835 | 2 | 0.08 | 0.96 | -0.60 | -0.17 | -1.01 | -0.83 |
| 1 | 1 | 836 | 1 | 0 | 0.58 | -0.84 | 0.50 | -0.94 | 2.24 |
| 1 | 1 | 837 | 1 | 0 | -0.03 | -0.28 | -1.49 | 2.53 | -0.66 |
| 1 | 1 | 838 | 1 | 0 | -1.79 | -0.34 | 0.71 | 0.88 | -2.10 |
| 1 | 1 | 839 | 3 | 0.12 | 0.38 | -0.28 | -0.01 | 0.06 | -0.85 |
| 1 | 1 | 840 | 1 | 0 | -1.08 | -0.50 | 1.30 | 1.11 | -0.21 |
| 1 | 1 | 841 | 1 | 0 | -1.59 | -0.10 | 1.20 | -0.17 | -1.47 |
| 1 | 1 | 842 | 2 | 0.07 | 0.48 | -1.38 | -1.31 | -0.53 | 0.06 |
| 1 | 1 | 843 | 1 | 0 | 0.40 | 0.63 | 1.13 | 1.36 | -0.06 |
| 1 | 1 | 844 | 2 | 0.08 | 1.56 | -0.22 | 0.22 | 0.71 | -0.45 |
| 1 | 1 | 845 | 1 | 0 | 0.29 | -0.95 | 0.53 | -2.33 | 0.85 |
| 1 | 1 | 846 | 2 | 0.09 | -1.29 | 1.41 | 0.04 | -0.40 | -0.76 |
| 1 | 1 | 847 | 1 | 0 | 0.38 | -0.67 | -1.18 | -0.19 | 0.79 |
| 1 | 1 | 848 | 3 | 0.17 | 0.69 | -0.71 | -0.28 | -0.44 | -0.98 |
| 1 | 1 | 849 | 1 | 0 | -0.90 | 1.21 | 0.39 | -1.10 | 0.42 |
| 1 | 1 | 850 | 1 | 0 | 0.35 | 0.82 | -1.74 | 1.45 | -0.98 |
| 1 | 1 | 851 | 3 | 0.12 | -0.94 | 0.12 | 0.29 | -0.39 | 0.68 |
| 1 | 1 | 852 | 1 | 0 | -0.96 | 0.19 | 0.95 | -0.75 | 0.93 |
| 1 | 1 | 853 | 1 | 0 | 0.61 | 1.55 | 1.25 | 0.03 | 1.79 |
| 1 | 1 | 854 | 2 | 0.16 | 0.99 | 0.89 | 0.40 | 1.98 | -0.38 |
| 1 | 1 | 855 | 2 | 0.17 | 2.13 | -0.63 | 0.63 | 1.18 | 0.20 |
| 1 | 1 | 856 | 2 | 0.13 | 0.24 | 1.88 | 0.74 | 0.39 | -0.27 |
| 1 | 1 | 857 | 2 | 0.12 | -0.33 | 0.04 | 0.94 | -0.92 | 0.55 |
| 1 | 1 | 858 | 1 | 0 | 0.07 | 1.39 | -1.31 | 0.17 | -0.06 |
| 1 | 1 | 859 | 3 | 0.14 | 0.22 | 0.01 | -1.98 | 1.60 | -0.05 |
| 1 | 1 | 860 | 1 | 0 | 1.39 | 0.93 | 1.88 | -0.74 | 0.12 |
| 1 | 1 | 861 | 2 | 0.15 | 1.33 | -1.64 | 1.61 | -0.59 | 1.40 |
| 1 | 1 | 862 | 2 | 0.1 | 0.21 | -0.59 | 0.22 | -2.29 | 1.43 |
| 1 | 1 | 863 | 2 | 0.08 | 0.23 | 0.02 | -0.79 | -0.25 | 0.20 |
| 1 | 1 | 864 | 2 | 0.06 | -0.14 | -0.14 | 0.35 | 0.48 | 0.52 |
| 1 | 1 | 865 | 1 | 0 | 1.50 | -2.67 | 0.51 | 0.46 | 1.46 |
| 1 | 1 | 866 | 1 | 0 | -0.13 | -2.07 | -0.42 | -0.01 | 0.42 |
| 1 | 1 | 867 | 1 | 0 | -0.46 | 0.23 | 2.10 | 0.09 | -0.22 |
| 1 | 1 | 868 | 1 | 0 | 1.46 | 1.08 | -0.65 | 0.24 | 1.72 |
| 1 | 1 | 869 | 3 | 0.11 | -0.55 | -0.65 | 0.22 | 0.65 | 0.48 |
| 1 | 1 | 870 | 1 | 0 | 0.46 | -1.18 | 0.75 | 0.11 | 0.14 |
| 1 | 1 | 871 | 1 | 0 | -0.37 | -1.05 | 0.14 | 1.29 | -0.30 |
| 1 | 1 | 872 | 1 | 0 | 0.73 | -1.83 | 0.76 | 1.60 | 1.11 |
| 1 | 1 | 873 | 2 | 0.11 | 0.11 | -0.57 | -1.09 | 0.35 | -0.34 |
| 1 | 1 | 874 | 2 | 0.08 | -0.08 | -0.06 | -0.01 | 0.77 | 0.52 |
| 1 | 1 | 875 | 1 | 0 | 1.51 | -0.61 | 0.12 | 0.25 | 1.09 |
| 1 | 1 | 876 | 2 | 0.07 | -1.59 | -0.62 | 0.06 | 0.27 | -1.23 |
| 1 | 1 | 877 | 2 | 0.15 | -0.11 | -1.48 | -1.67 | 0.46 | 0.78 |
| 1 | 1 | 878 | 2 | 0.22 | 0.15 | 0.92 | -0.57 | 1.53 | 2.56 |
| 1 | 1 | 879 | 1 | 0 | -0.28 | -0.75 | 0.42 | -0.99 | -0.56 |
| 1 | 1 | 880 | 1 | 0 | 1.21 | -0.36 | 1.13 | -0.19 | 1.64 |
| 1 | 1 | 881 | 1 | 0 | -1.63 | 0.58 | 0.75 | -1.25 | -1.33 |
| 1 | 1 | 882 | 1 | 0 | 0.64 | -0.09 | 0.82 | 0.64 | 0.11 |
| 1 | 1 | 883 | 1 | 0 | 0.26 | -1.10 | -1.09 | -1.07 | 2.01 |
| 1 | 1 | 884 | 1 | 0 | 1.23 | -1.57 | -0.89 | 1.19 | 0.24 |
| 1 | 1 | 885 | 1 | 0 | 0.13 | -0.23 | 1.31 | 1.50 | 2.51 |
| 1 | 1 | 886 | 1 | 0 | 0.24 | -0.04 | 0.40 | 0.68 | 0.85 |
| 1 | 1 | 887 | 1 | 0 | -0.74 | -0.46 | -0.36 | 0.11 | 1.73 |
| 1 | 1 | 888 | 1 | 0 | 2.32 | -2.28 | -0.03 | -1.57 | -0.98 |
| 1 | 1 | 889 | 3 | 0.09 | -0.25 | -0.02 | -0.97 | 0.57 | -0.39 |
| 1 | 1 | 890 | 2 | 0.14 | -0.05 | -1.82 | -0.66 | 0.96 | 0.84 |
| 1 | 1 | 891 | 1 | 0 | 1.52 | -2.41 | -1.52 | 2.39 | 0.46 |
| 1 | 1 | 892 | 1 | 0 | -1.61 | 1.00 | -0.42 | 0.59 | -3.21 |
| 1 | 1 | 893 | 2 | 0.11 | 1.23 | -0.58 | 0.04 | -1.31 | 0.78 |
| 1 | 1 | 894 | 3 | 0.24 | -1.45 | -0.03 | -1.13 | 0.12 | 0.77 |
| 1 | 1 | 895 | 2 | 0.14 | -1.08 | 0.00 | 1.58 | -0.71 | -1.32 |
| 1 | 1 | 896 | 1 | 0 | 1.08 | -0.35 | 1.18 | 0.29 | -1.13 |
| 1 | 1 | 897 | 1 | 0 | 0.28 | -0.67 | -0.54 | 0.11 | -1.04 |
| 1 | 1 | 898 | 2 | 0.1 | 0.47 | 0.08 | -0.36 | -1.67 | 1.04 |
| 1 | 1 | 899 | 3 | 0.2 | 0.25 | -1.05 | -1.04 | -0.50 | 1.40 |
| 1 | 1 | 900 | 1 | 0 | -1.73 | -1.50 | -0.59 | -1.48 | -0.43 |
| 1 | 1 | 901 | 3 | 0.19 | 0.41 | -1.77 | -0.28 | -1.26 | -0.60 |
| 1 | 1 | 902 | 1 | 0 | 0.73 | -0.64 | -0.93 | 0.62 | -0.38 |
| 1 | 1 | 903 | 2 | 0.14 | -0.58 | -0.44 | -1.18 | 2.01 | 0.56 |
| 1 | 1 | 904 | 1 | 0 | 1.50 | 1.08 | -0.18 | -0.93 | -0.05 |
| 1 | 1 | 905 | 1 | 0 | 0.05 | 0.37 | 2.44 | 0.84 | 2.49 |
| 1 | 1 | 906 | 2 | 0.07 | -1.47 | 0.77 | 0.89 | 0.72 | -0.75 |
| 1 | 1 | 907 | 2 | 0.07 | -1.11 | -0.40 | -0.94 | -0.94 | -1.02 |
| 1 | 1 | 908 | 2 | 0.11 | 1.80 | 0.32 | 0.35 | 0.01 | -0.59 |
| 1 | 1 | 909 | 1 | 0 | -0.93 | 0.60 | 1.12 | 0.35 | -0.03 |
| 1 | 1 | 910 | 1 | 0 | -0.10 | -0.30 | -0.98 | 0.98 | 0.03 |
| 1 | 1 | 911 | 1 | 0 | 0.68 | -1.17 | 0.53 | -1.07 | -0.79 |
| 1 | 1 | 912 | 2 | 0.14 | -0.82 | -0.32 | 0.11 | 1.98 | -0.40 |
| 1 | 1 | 913 | 1 | 0 | 0.65 | -2.11 | 1.02 | 0.34 | -0.93 |
| 1 | 1 | 914 | 1 | 0 | -1.32 | -0.59 | 0.10 | 2.40 | -1.33 |
| 1 | 1 | 915 | 1 | 0 | -2.04 | -0.48 | -0.29 | 1.02 | 0.90 |
| 1 | 1 | 916 | 1 | 0 | 1.91 | -0.70 | 1.03 | -0.10 | -0.11 |
| 1 | 1 | 917 | 2 | 0.14 | -0.42 | 1.94 | -0.61 | 0.78 | -0.40 |
| 1 | 1 | 918 | 1 | 0 | 1.05 | -0.63 | -0.27 | 0.94 | -1.53 |
| 1 | 1 | 919 | 2 | 0.1 | 1.11 | 0.15 | 0.69 | 1.45 | 0.59 |
| 1 | 1 | 920 | 1 | 0 | 0.23 | 0.36 | 1.92 | -1.11 | 0.17 |
| 1 | 1 | 921 | 1 | 0 | 0.22 | 0.00 | 0.53 | -0.55 | -0.14 |
| 1 | 1 | 922 | 2 | 0.15 | 0.95 | 0.49 | -1.54 | 0.60 | 0.13 |
| 1 | 1 | 923 | 1 | 0 | 0.40 | 1.23 | -1.92 | -0.14 | -2.42 |
| 1 | 1 | 924 | 2 | 0.1 | -0.97 | 1.73 | 0.65 | 1.09 | -1.66 |
| 1 | 1 | 925 | 1 | 0 | 1.32 | -0.69 | -0.28 | 1.04 | 1.28 |
| 1 | 1 | 926 | 2 | 0.11 | -1.14 | -1.12 | 0.02 | -0.75 | -1.48 |
| 1 | 1 | 927 | 3 | 0.22 | -0.89 | -0.89 | -1.31 | -0.55 | 0.13 |
| 1 | 1 | 928 | 2 | 0.09 | -1.69 | -0.58 | -0.78 | -0.24 | -0.39 |
| 1 | 1 | 929 | 2 | 0.16 | -1.93 | 0.30 | 0.77 | -1.37 | 1.34 |
| 1 | 1 | 930 | 2 | 0.08 | -0.01 | -1.42 | -0.20 | 0.17 | -0.02 |
| 1 | 1 | 931 | 2 | 0.06 | 0.75 | 0.84 | -0.09 | -0.10 | 0.85 |
| 1 | 1 | 932 | 2 | 0.13 | 0.08 | 0.80 | -0.57 | -1.33 | 0.95 |
| 1 | 1 | 933 | 2 | 0.1 | 0.45 | 0.87 | 1.01 | 0.79 | -0.51 |
| 1 | 1 | 934 | 2 | 0.05 | -0.09 | -0.39 | 0.47 | -1.00 | 1.46 |
| 1 | 1 | 935 | 1 | 0 | 0.72 | -1.55 | 0.45 | -0.77 | 0.64 |
| 1 | 1 | 936 | 1 | 0 | 0.24 | 1.24 | 1.08 | 0.24 | 1.22 |
| 1 | 1 | 937 | 3 | 0.11 | -0.69 | -0.28 | 0.23 | -0.21 | 0.37 |
| 1 | 1 | 938 | 1 | 0 | -0.79 | 0.18 | 0.72 | 0.34 | -1.27 |
| 1 | 1 | 939 | 3 | 0.14 | 0.58 | -1.23 | -0.39 | 1.41 | 0.00 |
| 1 | 1 | 940 | 2 | 0.06 | -0.11 | 0.40 | -0.18 | -0.82 | 0.49 |
| 1 | 1 | 941 | 2 | 0.08 | 0.59 | 0.26 | -0.04 | 0.15 | -0.56 |
| 1 | 1 | 942 | 1 | 0 | 1.86 | -0.96 | 1.42 | -2.10 | -0.16 |
| 1 | 1 | 943 | 1 | 0 | 1.06 | -0.88 | -0.90 | -1.20 | -0.62 |
| 1 | 1 | 944 | 2 | 0.12 | 0.52 | 0.59 | -1.86 | -0.09 | -0.65 |
| 1 | 1 | 945 | 2 | 0.14 | -1.15 | -0.83 | -1.52 | 0.40 | 0.89 |
| 1 | 1 | 946 | 1 | 0 | -0.73 | -1.09 | 1.07 | -1.22 | -0.17 |
| 1 | 1 | 947 | 2 | 0.12 | 1.05 | 0.73 | -0.58 | -0.97 | 1.20 |
| 1 | 1 | 948 | 1 | 0 | -1.51 | -1.57 | -0.58 | 1.08 | 0.10 |
| 1 | 1 | 949 | 1 | 0 | -1.21 | 2.27 | -0.57 | 0.69 | 0.05 |
| 1 | 1 | 950 | 1 | 0 | 0.68 | 0.11 | -0.19 | 0.41 | -0.17 |
| 1 | 1 | 951 | 1 | 0 | 1.48 | -0.49 | -2.73 | -2.02 | -1.94 |
| 1 | 1 | 952 | 1 | 0 | -0.73 | -0.73 | 0.27 | -0.14 | -0.16 |
| 1 | 1 | 953 | 3 | 0.13 | -0.87 | -0.86 | -1.52 | -0.04 | -0.60 |
| 1 | 1 | 954 | 1 | 0 | -1.66 | 0.11 | -0.80 | -2.93 | -0.69 |
| 1 | 1 | 955 | 2 | 0.09 | 0.13 | -0.85 | 0.32 | -1.00 | 0.67 |
| 1 | 1 | 956 | 1 | 0 | 1.39 | -1.21 | 0.42 | -1.37 | -2.71 |
| 1 | 1 | 957 | 2 | 0.11 | -0.22 | 0.00 | 0.61 | -0.53 | 1.13 |
| 1 | 1 | 958 | 1 | 0 | 0.13 | -0.99 | 0.04 | -0.52 | 0.71 |
| 1 | 1 | 959 | 1 | 0 | 0.88 | 0.48 | -2.57 | -1.15 | -0.91 |
| 1 | 1 | 960 | 1 | 0 | -1.45 | -1.78 | 0.79 | 1.23 | 0.27 |
(Datatable output truncated: remaining rows 961–1035 omitted for brevity.)
HVT model diagnostics are used to evaluate the model fit and investigate the proximity between centroids. The distribution of proximity values can also be used to decide an optimum Mean Absolute Deviation threshold for HVT-model-based prediction.
The diagnosis can be enabled by setting the diagnose parameter to TRUE while building the HVT model.
Model validation is used to measure the fit/quality of the model. Measuring model fit is key to iteratively improving the models. The relevant measure of model quality here is the percentage of anomalous points. The percentage of anomalies should ideally match the level of compression achieved during modeling, where PercentageAnomalies \(\approx\) 1 - ModelCompression.
Model validation can be enabled by setting the hvt_validation parameter to TRUE and setting the train_validation_split_ratio value while training the HVT model.
The model trained above has a train_validation_split_ratio of 0.8, i.e. 80% of the train dataset is used for training the model while the remaining 20% is used for validation.
Note: The user can skip this step if the number of observations in the train data is low.
The basic tools for examining the model fit are proximity plots and the distribution of observations across centroids.
The proximity between objects can be measured with a distance matrix. The distances between the objects are calculated using the Manhattan or Euclidean distance and arranged in matrix form. In the next step we find the minimum value in each row, excluding the diagonal values, since the diagonal elements of a distance matrix are zero, representing the distance from an object to itself. This minimum value gives the proximity (distance to the nearest neighbour) for each object in the datatable.
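The proximity computation described above can be sketched in a few lines of base R. This is an illustrative sketch on hypothetical random data, not the package implementation: build the pairwise distance matrix, mask the zero diagonal, and take the row-wise minimum.

```r
# Illustrative sketch (base R, hypothetical 2-column data) of the
# nearest-neighbour proximity computation described above.
set.seed(42)
pts <- matrix(rnorm(20), ncol = 2)  # 10 observations, 2 variables

d <- as.matrix(dist(pts, method = "manhattan"))  # or method = "euclidean"
diag(d) <- Inf                                   # exclude self-distance
nearest_neighbour_dist <- apply(d, 1, min)       # distance to nearest neighbour

# The mean of these minimum distances is the summary value shown in the
# diagnostic histograms.
mean(nearest_neighbour_dist)
```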
The plotDiag() function can be used to print diagnostic plots for an HVT model or an HVT prediction.
The plotDiag() function for an HVT model provides 5 diagnostic plots, which are as follows:
Let’s have a look at the function plotDiag, which we will use to print the diagnostic plots.
plotDiag(hvt.results)
The first diagnostic plot is a calibration plot for the HVT model run on the train data. This plot is obtained by scoring (predicting) the train data itself on the HVT model. It is a comparison of Percentage_Anomalies at varying Mean Absolute Deviation values. It can be seen from the plot that at a Mean Absolute Deviation value of 0.22 the percentage of anomalies drops below one percent.
p3=hvt.results[[4]]$mad_plot_train+ggtitle("Mean Absolute Deviation Plot: Calibration: HVT Model | Train Data")
p3
The second diagnostic plot helps us find out how the points in the training data are distributed. Shown below is a histogram of the minimum distance to the nearest neighbour for each observation in the train data.
p1=hvt.results[[4]]$datapoint_plot+ggtitle("Minimum Intra-DataPoint Distance Plot: Train Data")
p1
As seen in the plot above, the mean value is 0.23. While running our model we had also selected a QE value of 0.2, which is near this value.
The third diagnostic plot helps us find out how the centroids in the HVT model are distributed. Shown below is a histogram of the minimum distance to the nearest neighbour for each centroid in the HVT model.
p2=hvt.results[[4]]$cent_plot+ggtitle("Minimum Intra-Centroid Distance Plot: HVT Model | Train Data")
p2
As seen in the plot above, the mean value is 0.6. This value can be selected as the Mean Absolute Deviation threshold for scoring data using the predict function.
The fourth diagnostic plot shows the distribution of the number of observations in each cell. Shown below is a histogram depicting the same.
p4=hvt.results[[4]]$number_plot+ggtitle("Distribution of Number of Observations in Cells: HVT Model | Train Data")
p4
As shown in the plot above, the mean number of records in each HVT cell is 2.
The fifth diagnostic plot shows the number of singleton centroids (segments/centroids with a single observation).
p5=hvt.results[[4]]$singleton_piechart
p5
The Mean Absolute Deviation plot for the validation data has been shown in the section above. Alternatively, to fetch it separately, we can use the following code:
m1=hvt.results[[5]][["mad_plot"]]+ggtitle("Mean Absolute Deviation Plot:Validation")
m1
As seen in the plot above, the suggested threshold for scoring using Mean Absolute Deviation is 0.59, which is very close to the mean of the distribution obtained in the Minimum Intra-Centroid Distance Plot section. Thus 0.6 (the mean of the minimum intra-centroid distance) can be used as the Mean Absolute Deviation threshold for prediction.
Now that we have built the model, let us predict on our test dataset to see which cell each point belongs to.
The prediction algorithm recursively calculates the distance between each point in the test dataset and the cell centroids at each level. The following steps explain the prediction method for a single point in the test dataset:
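The per-level assignment described above can be sketched in R. This is an illustrative sketch, not the package implementation; assign_to_cell is a hypothetical helper, and the L1 (Manhattan) distance is used as in the model above.

```r
# Illustrative sketch of assigning a single test point to its nearest
# centroid at one level of the hierarchy, using the Manhattan (L1) distance.
assign_to_cell <- function(point, centroids) {
  # centroids: one row per cell centroid, same columns as `point`
  dists <- apply(centroids, 1, function(centroid) sum(abs(point - centroid)))
  which.min(dists)  # index of the winning cell; recurse into its children
}

centroids <- rbind(c(0, 0), c(1, 1), c(-1, 2))
assign_to_cell(c(0.9, 1.2), centroids)  # -> 2 (closest centroid)
```

In the full algorithm this assignment is repeated level by level: the point descends into the children of the winning cell until the deepest level is reached.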
The user can provide an absolute or relative path in the cell below to access the data from their computer.
load_test_data <- FALSE
# Loading the data in the Rstudio environment
# Please change the path in the code line below to the path location of the .csv file
if(load_test_data){
  file_name <- "hotel_data_test.csv" ## Single_hotel_Time_Series.csv, HotelPanel_100.csv
  file_path <- "./sample_dataset/"
  file_load <- paste0(file_path, file_name)
  dataset_updated_test <- as.data.frame(fread(file_load))
  if(nrow(dataset_updated_test) > 0){
    paste0("File ", file_name, " having ", nrow(dataset_updated_test), " row(s) and ", ncol(dataset_updated_test), " column(s)", " imported successfully. ") %>% cat("\n")
    # Round only the numeric columns in dataset
    dataset_updated_test <- dataset_updated_test %>% mutate_if(is.numeric, round, digits = 4)
    paste0("Code chunk executed successfully. Below table showing first 10 row(s) of the dataset.") %>% cat("\n")
    # Display imported dataset
    dataset_updated_test %>% head(10) %>%
      as.data.frame() %>%
      DT::datatable(options = options, rownames = TRUE)
  }
  colnames(dataset_updated_test) <- colnames(dataset_updated_test) %>% casefold()
  dataset_updated_test <- spaceless(dataset_updated_test)
}
In this section we will perform one-hot encoding on the test dataset, depending on whether one-hot encoding was performed on the train dataset.
In this section we will subset the test data based on the numeric columns present in the train data.
dataset_updated_test <- dataset_updated_test %>% dplyr::select(nums)
Now that we have the test data ready, let's look at what the predictHVT function looks like.
predictHVT(data,
           hvt.results,
           hmap.cols = NULL,
           child.level = 1,
           mad.threshold,
           ...)
The important parameters for the function predictHVT are as below:
data - A dataframe containing the test dataset. The dataframe should have at least one variable used for training. The variables from this dataset can also be overlaid as a heatmap.
hvt.results - A list of hvt.results obtained from the HVT function while performing hierarchical vector quantization on the training data.
hmap.cols - The column number or column name from the dataset indicating the variable for which the heat map is to be plotted. A heatmap won't be plotted if NULL is passed (Default = NULL).
child.level - A number indicating the level for which the heat map is to be plotted (only used if hmap.cols is not NULL).
mad.threshold - A numeric value indicating the Mean Absolute Deviation threshold used to flag anomalous points.
... - color.vec and line.width can be passed from here.
Here the mad_threshold has been selected as 0.6, which is based on the mean of the minimum intra-centroid distance and verified by the validation Mean Absolute Deviation plot.
hvt.prediction = list()
mad_threshold= 0.6 #Mean of Minimum Intra-Centroid Distance
hvt.prediction <- muHVT::predictHVT(
data = dataset_updated_test,
hvt.results.model=hvt.results,
child.level = 1,
mad.threshold = mad_threshold,
line.width = c(0.6, 0.4, 0.2),
color.vec = c("#141B41", "#6369D1", "#D8D2E1"),
distance_metric = "L1_Norm",
error_metric = "max"
)
The plotDiag() function can be called on a prediction object as well. Shown below is a comparison of the Mean Absolute Deviation plot for the train, validation and test data.
plotDiag(hvt.prediction)
QEdata=hvt.results[[3]]$summary
Quant.Error.Actual <- QEdata %>%
mutate(Quant.Error = Quant.Error * n) %>%
select(Quant.Error, n) %>%
summarise_all(sum) %>%
transmute(Quant.Error = Quant.Error/ n) %>%
unlist()%>%round(4)
# predictions[["predictPlot"]]
#
# Table(predictions$predictions %>% dplyr::relocate(Cell_path), scroll = T, limit = 10)
The table below shows the cell(s) containing anomalous test data points. Data points are flagged as anomalous when their Quantization Error is greater than that of the assigned centroid, based on the error metric. A comparison between the scored/test and fitted Quantization Error of the cell(s) is provided for further insight.
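The anomaly rule described above reduces to a simple comparison per point. This is an illustrative sketch, not the package code; flag_anomaly is a hypothetical helper, and the QE values are made up.

```r
# Illustrative sketch of the anomaly rule: a scored point is flagged when
# its quantization error exceeds the fitted quantization error of the cell
# it was assigned to.
flag_anomaly <- function(scored_qe, fitted_qe) as.integer(scored_qe > fitted_qe)

flag_anomaly(scored_qe = 0.25, fitted_qe = 0.08)  # -> 1 (anomalous)
flag_anomaly(scored_qe = 0.05, fitted_qe = 0.08)  # -> 0 (within fit)
```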
QECompareDf <- hvt.prediction$QECompareDf %>% filter(anomalyFlag == 1)
percentageAnomalies = formatC((sum(QECompareDf$n) / nrow(dataset_updated_test)) *
100, digits = 2, format = "f") %>% paste0("%")
Number of test data points: 500 | Number of anomalous data points: 2 | Percentage of anomalous data points: 0.40%
Mean QE for fitted data: 0.078 | Mean QE for test data: 0.2502 |
Difference in QE between fitted and test data: 0.1722
qeHistPlot(hvt.results,hvt.prediction)
The anomalous observations are shown below in the datatable.
QECompareDf <- hvt.prediction$QECompareDf %>% filter(anomalyFlag == 1)
# adding cell ID to table
QECompareDf1 <-
left_join(
QECompareDf,
hvt.results[[3]]$summary,
by = c("Segment.Level", "Segment.Parent", "Segment.Child")
)
QECompareDf1 <-
QECompareDf1 %>% select(
"Segment.Level",
"Segment.Parent",
"Segment.Child",
# "Cell.ID",
"anomalyFlag",
"n.x" ,
"Fitted.Quant.Error" ,
"Scored.Quant.Error" ,
"Quant.Error.Diff",
"Quant.Error.Diff (%)"
)
colnames(QECompareDf1) <- c(
"Segment.Level",
"Segment.Parent",
"Segment.Child",
# "Cell.ID",
"anomalyFlag",
"n" ,
"Fitted.Quant.Error" ,
"Scored.Quant.Error" ,
"Quant.Error.Diff",
"Quant.Error.Diff (%)"
)
QECompareDf1$flag <-
ifelse(QECompareDf1$anomalyFlag == 1, 1, 0)
DT::datatable(
QECompareDf1 %>%
mutate_if(is.numeric, ~ round(., 4)) %>%
select(-c(`Quant.Error.Diff (%)`)) %>%
rename(No.Of.Points = n),
class = 'cell-border stripe',
rownames = FALSE,
filter = "top",
escape = FALSE,
selection = "none",
options = options,
callback = htmlwidgets::JS(
"var tips = ['Segment Level based on the hierarchical structure of the muHVT model output',
'Segment Parent based on the hierarchical structure of the muHVT model output',
'Segment Child based on the hierarchical structure of the muHVT model output',
'Cell ID based on the hierarchical structure of the muHVT model output',
'Flag indicating whether data points are anomalous or not',
'Number of anomalous scored data points for each centroid in the muHVT map',
'Quantization Error for the highlighted cell built on the fitted data',
'Quantization Error for the highlighted cell built on the scored data',
'Change in Quantization Error between scored and the fitted model'],
header = table.columns().header();
for (var i = 0; i < tips.length; i++) {
$(header[i]).attr('title', tips[i]);
}"
)
) %>%
formatStyle('flag',
target = 'row',
backgroundColor = styleEqual(c(1), c('#ff7f7f')))
The interactive plot below shows the anomalous cell(s), with color indicating the change in Quantization Error after scoring.
hvt.prediction[["predictPlot"]]
The predictions from the sections above can be downloaded in the section below. The downloaded predictions can be found in the output(LOCAL) folder.
predictClusterData <- hvt.prediction[["scoredPredictedData"]]%>%as.data.frame()
predictClusterData %>% head(100) %>% round(2)%>%
as.data.frame() %>%
DT::datatable(options = options, rownames = TRUE)
Pricing Segmentation - The package can be used to discover groups of similar customers based on customer spend patterns and to understand the price sensitivity of customers.
Market Segmentation - The package can be helpful in market segmentation, where we have to identify micro and macro segments. The method used in this package can do both kinds of segmentation in one go.
Anomaly Detection - This method can help us categorize system behaviour over time and find anomalies when there are changes in the system, e.g. finding fraudulent claims in healthcare insurance.
The package can help us understand the underlying structure of the data. Suppose we want to analyze a curved surface such as a sphere or a vase; we can approximate it by many small low-order polygons in the form of tessellations using this package.
In biology, Voronoi diagrams are used to model a number of different biological structures, including cells and bone microarchitecture
Using the base idea of Systems Dynamics, these diagrams can also be used to depict customer state changes over a period of time
Vector Quantization: https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-450-principles-of-digital-communications-i-fall-2006/lecture-notes/book_3.pdf
Sammon’s Projection: http://en.wikipedia.org/wiki/Sammon_mapping
Voronoi Tessellations: http://en.wikipedia.org/wiki/Centroidal_Voronoi_tessellation